Dec 16 14:56:58 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 16 14:56:58 crc restorecon[4701]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 
14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 16 14:56:58 crc 
restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 
14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 
14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 16 14:56:58 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:58 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 16 14:56:59 crc restorecon[4701]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 16 14:56:59 crc kubenswrapper[4728]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 14:56:59 crc kubenswrapper[4728]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 16 14:56:59 crc kubenswrapper[4728]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 14:56:59 crc kubenswrapper[4728]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
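[Editor's note: the deprecation warnings above, and the two that follow, all point at the same remediation: move these flags into the file the kubelet reads via --config. As a rough, hypothetical sketch of what that looks like (field names are from the upstream kubelet.config.k8s.io/v1beta1 KubeletConfiguration; the endpoint, path, taint, and reservation values below are illustrative, not values taken from this node):

    # Hypothetical KubeletConfiguration equivalents for the deprecated flags
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock      # replaces --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
    registerWithTaints:                                           # replaces --register-with-taints
    - key: node-role.kubernetes.io/master
      effect: NoSchedule
    systemReserved:                                               # replaces --system-reserved
      cpu: 500m
      memory: 1Gi

--minimum-container-ttl-duration has no direct config-file equivalent; per its warning it is superseded by the evictionHard/evictionSoft settings.]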
Dec 16 14:56:59 crc kubenswrapper[4728]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 16 14:56:59 crc kubenswrapper[4728]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.300178 4728 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305841 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305888 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305898 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305907 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305918 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305927 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305936 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305944 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305953 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305962 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305969 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305977 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305985 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305992 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.305999 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306007 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306017 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
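[Editor's note: most of the gate names in these warnings, above and below, are OpenShift-specific and unknown to the embedded upstream kubelet, so its feature-gate parser logs them as unrecognized and continues; only gates it does recognize (such as the GA ValidatingAdmissionPolicy and the deprecated KMSv1 flagged at feature_gate.go:353/351) actually change behavior. For orientation, a hypothetical sketch of how gates reach the kubelet through its config file (gate names copied from these warnings; the boolean values are illustrative, not read from this node):

    # Hypothetical featureGates stanza in a KubeletConfiguration
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      ValidatingAdmissionPolicy: true  # known GA gate (feature_gate.go:353 message)
      KMSv1: true                      # known deprecated gate (feature_gate.go:351 message)
      GatewayAPI: true                 # OpenShift-only name -> "unrecognized feature gate"
]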
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306027 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306036 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306044 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306051 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306060 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306072 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306081 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306089 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306097 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306104 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306112 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306119 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306127 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306135 4728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306143 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306150 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306158 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306165 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306173 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306180 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306194 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306203 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306210 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306218 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306226 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306234 4728 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306242 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306250 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306258 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306265 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306276 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306286 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306295 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306303 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306311 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306319 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306329 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306337 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306345 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306353 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306361 4728 feature_gate.go:330] unrecognized feature gate: Example Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306369 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306377 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306385 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306395 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
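The wall of feature_gate.go:330 warnings is the kubelet's gate parser logging, rather than failing on, gate names it has no entry for; the full list is re-emitted on every parser pass, and the same block repeats several more times below. So counting distinct names is more informative than counting lines. A minimal sketch against the hypothetical capture "journal-excerpt.log":

    import re
    from collections import Counter

    # Count distinct unrecognized gates and how often each is re-reported.
    gate_re = re.compile(r"unrecognized feature gate: (\w+)")

    with open("journal-excerpt.log", encoding="utf-8") as fh:
        counts = Counter(name for line in fh for name in gate_re.findall(line))

    print(len(counts), "distinct unrecognized gates")
    print(counts.most_common(3))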
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306432 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306441 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306451 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306459 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306467 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306476 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306484 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306493 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.306501 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306878 4728 flags.go:64] FLAG: --address="0.0.0.0" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306897 4728 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306914 4728 flags.go:64] FLAG: --anonymous-auth="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306925 4728 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306938 4728 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306947 4728 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306959 4728 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306970 4728 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306979 4728 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306988 4728 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.306998 4728 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307008 4728 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307017 4728 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307027 4728 flags.go:64] FLAG: --cgroup-root="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307036 4728 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307045 4728 flags.go:64] FLAG: --client-ca-file="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307054 4728 flags.go:64] FLAG: --cloud-config="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307063 4728 flags.go:64] FLAG: --cloud-provider="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307071 4728 flags.go:64] FLAG: --cluster-dns="[]" Dec 16 14:56:59 crc kubenswrapper[4728]: 
I1216 14:56:59.307084 4728 flags.go:64] FLAG: --cluster-domain="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307093 4728 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307102 4728 flags.go:64] FLAG: --config-dir="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307111 4728 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307121 4728 flags.go:64] FLAG: --container-log-max-files="5" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307133 4728 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307142 4728 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307151 4728 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307160 4728 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307169 4728 flags.go:64] FLAG: --contention-profiling="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307178 4728 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307187 4728 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307197 4728 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307205 4728 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307216 4728 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307225 4728 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307235 4728 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307244 4728 flags.go:64] FLAG: --enable-load-reader="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307253 4728 flags.go:64] FLAG: --enable-server="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307269 4728 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307280 4728 flags.go:64] FLAG: --event-burst="100" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307290 4728 flags.go:64] FLAG: --event-qps="50" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307299 4728 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307308 4728 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307317 4728 flags.go:64] FLAG: --eviction-hard="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307327 4728 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307336 4728 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307345 4728 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307354 4728 flags.go:64] FLAG: --eviction-soft="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307364 4728 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 16 14:56:59 
crc kubenswrapper[4728]: I1216 14:56:59.307373 4728 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307382 4728 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307391 4728 flags.go:64] FLAG: --experimental-mounter-path="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307400 4728 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307439 4728 flags.go:64] FLAG: --fail-swap-on="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307449 4728 flags.go:64] FLAG: --feature-gates="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307459 4728 flags.go:64] FLAG: --file-check-frequency="20s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307469 4728 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307479 4728 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307488 4728 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307497 4728 flags.go:64] FLAG: --healthz-port="10248" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307506 4728 flags.go:64] FLAG: --help="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307516 4728 flags.go:64] FLAG: --hostname-override="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307525 4728 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307534 4728 flags.go:64] FLAG: --http-check-frequency="20s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307542 4728 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307551 4728 flags.go:64] FLAG: --image-credential-provider-config="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307560 4728 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307569 4728 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307577 4728 flags.go:64] FLAG: --image-service-endpoint="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307588 4728 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307597 4728 flags.go:64] FLAG: --kube-api-burst="100" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307607 4728 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307617 4728 flags.go:64] FLAG: --kube-api-qps="50" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307626 4728 flags.go:64] FLAG: --kube-reserved="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307636 4728 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307645 4728 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307655 4728 flags.go:64] FLAG: --kubelet-cgroups="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307664 4728 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307673 4728 flags.go:64] FLAG: --lock-file="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 
14:56:59.307682 4728 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307692 4728 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307701 4728 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307715 4728 flags.go:64] FLAG: --log-json-split-stream="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307724 4728 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307733 4728 flags.go:64] FLAG: --log-text-split-stream="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307742 4728 flags.go:64] FLAG: --logging-format="text" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307752 4728 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307762 4728 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307771 4728 flags.go:64] FLAG: --manifest-url="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307780 4728 flags.go:64] FLAG: --manifest-url-header="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307792 4728 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307801 4728 flags.go:64] FLAG: --max-open-files="1000000" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307811 4728 flags.go:64] FLAG: --max-pods="110" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307822 4728 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307831 4728 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307846 4728 flags.go:64] FLAG: --memory-manager-policy="None" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307854 4728 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307863 4728 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307873 4728 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307882 4728 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307900 4728 flags.go:64] FLAG: --node-status-max-images="50" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307909 4728 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307918 4728 flags.go:64] FLAG: --oom-score-adj="-999" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307928 4728 flags.go:64] FLAG: --pod-cidr="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307937 4728 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307950 4728 flags.go:64] FLAG: --pod-manifest-path="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307959 4728 flags.go:64] FLAG: --pod-max-pids="-1" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307968 4728 flags.go:64] FLAG: --pods-per-core="0" Dec 16 
14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.307978 4728 flags.go:64] FLAG: --port="10250" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308023 4728 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308034 4728 flags.go:64] FLAG: --provider-id="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308044 4728 flags.go:64] FLAG: --qos-reserved="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308053 4728 flags.go:64] FLAG: --read-only-port="10255" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308062 4728 flags.go:64] FLAG: --register-node="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308071 4728 flags.go:64] FLAG: --register-schedulable="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308080 4728 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308094 4728 flags.go:64] FLAG: --registry-burst="10" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308103 4728 flags.go:64] FLAG: --registry-qps="5" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308112 4728 flags.go:64] FLAG: --reserved-cpus="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308121 4728 flags.go:64] FLAG: --reserved-memory="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308132 4728 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308141 4728 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308150 4728 flags.go:64] FLAG: --rotate-certificates="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308159 4728 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308168 4728 flags.go:64] FLAG: --runonce="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308177 4728 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308189 4728 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308201 4728 flags.go:64] FLAG: --seccomp-default="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308210 4728 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308219 4728 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308229 4728 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308239 4728 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308248 4728 flags.go:64] FLAG: --storage-driver-password="root" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308258 4728 flags.go:64] FLAG: --storage-driver-secure="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308267 4728 flags.go:64] FLAG: --storage-driver-table="stats" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308276 4728 flags.go:64] FLAG: --storage-driver-user="root" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308307 4728 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308319 4728 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 
14:56:59.308331 4728 flags.go:64] FLAG: --system-cgroups="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308343 4728 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308361 4728 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308370 4728 flags.go:64] FLAG: --tls-cert-file="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308379 4728 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308391 4728 flags.go:64] FLAG: --tls-min-version="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308427 4728 flags.go:64] FLAG: --tls-private-key-file="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308436 4728 flags.go:64] FLAG: --topology-manager-policy="none" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308446 4728 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308455 4728 flags.go:64] FLAG: --topology-manager-scope="container" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308464 4728 flags.go:64] FLAG: --v="2" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308476 4728 flags.go:64] FLAG: --version="false" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308488 4728 flags.go:64] FLAG: --vmodule="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308498 4728 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.308508 4728 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308792 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308805 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
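The flags.go:64 lines above dump the effective value of every command-line flag after parsing, one FLAG per entry with the value in double quotes. A minimal sketch, against the hypothetical capture "journal-excerpt.log", that rebuilds them as a dictionary:

    import re

    # Recover the kubelet's effective flag values from the flags.go:64 dump.
    flag_re = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')

    flags = {}
    with open("journal-excerpt.log", encoding="utf-8") as fh:
        for line in fh:
            flags.update(flag_re.findall(line))

    print(flags["--config"])           # /etc/kubernetes/kubelet.conf
    print(flags["--node-ip"])          # 192.168.126.11
    print(flags["--system-reserved"])  # cpu=200m,ephemeral-storage=350Mi,memory=350Mi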
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308815 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308824 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308833 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308847 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308860 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308869 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308877 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308885 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308893 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308902 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308910 4728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308918 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308926 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308934 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308941 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308949 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308957 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308967 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308976 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308984 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.308992 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309000 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309008 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309015 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309023 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309030 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309038 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309049 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309059 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309067 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309075 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309083 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309091 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309099 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309106 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309118 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309129 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309136 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309145 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309153 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309160 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309168 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309176 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309183 4728 
feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309192 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309200 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309207 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309215 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309223 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309231 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309241 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309251 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309261 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309269 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309278 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309286 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309294 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309302 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309310 4728 feature_gate.go:330] unrecognized feature gate: Example Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309318 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309325 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309333 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309341 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309350 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309357 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309365 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309373 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309384 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.309394 4728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes 
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.309445 4728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.321115 4728 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.321171 4728 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321732 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321752 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321758 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321764 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321770 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321776 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321781 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321787 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321792 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321797 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321802 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321808 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321812 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321818 4728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321823 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321828 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321834 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321839 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321844 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321850 4728 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321855 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321860 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321865 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321870 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321875 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321880 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321885 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321891 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321896 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321901 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321907 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321912 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321917 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321928 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321933 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321938 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321947 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321955 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321961 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321968 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321975 4728 feature_gate.go:330] unrecognized feature gate: Example Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321981 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321987 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321992 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.321998 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322003 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322010 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322016 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322022 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322028 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322033 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322038 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322044 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322049 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322055 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322060 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322065 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322070 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322076 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322082 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322087 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322092 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322102 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322108 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322114 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322119 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322125 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322131 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322136 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322141 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322146 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.322157 4728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322331 4728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322339 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322344 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322349 4728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322355 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322360 4728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322365 4728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322370 4728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322375 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322380 4728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322385 4728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322390 4728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322397 4728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
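After each pass of warnings, feature_gate.go:386 logs the gate set the kubelet actually applied, printed as a Go map literal. A minimal sketch that lifts that literal into a Python dict of booleans; the string here is abridged to a few of the entries shown above:

    import re

    # Parse the Go map literal from a feature_gate.go:386 line.
    logged = ("feature gates: {map[CloudDualStackNodeIPs:true "
              "DisableKubeletCloudCredentialProviders:true KMSv1:true "
              "NodeSwap:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")

    inner = re.search(r"map\[(.*)\]", logged).group(1)
    gates = {k: v == "true" for k, v in (item.split(":") for item in inner.split())}

    print(gates["KMSv1"])                              # True
    print(sorted(k for k, on in gates.items() if on))  # the enabled gates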
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322458 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322466 4728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322472 4728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322478 4728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322484 4728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322489 4728 feature_gate.go:330] unrecognized feature gate: Example Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322494 4728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322501 4728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322508 4728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322513 4728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322519 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322525 4728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322531 4728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322536 4728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322542 4728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322547 4728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322551 4728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322557 4728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322562 4728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322567 4728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322572 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322577 4728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322582 4728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322588 4728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322593 4728 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322598 4728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322603 4728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322608 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322613 4728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322619 4728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322624 4728 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322628 4728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322633 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322638 4728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322644 4728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322649 4728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322654 4728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322659 4728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322664 4728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322669 4728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322674 4728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322679 4728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322684 4728 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322689 4728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322695 4728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322702 4728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322708 4728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322714 4728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322720 4728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322726 4728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322731 4728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322737 4728 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322743 4728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322748 4728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322753 4728 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322759 4728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322764 4728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.322769 4728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.322778 4728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.323288 4728 server.go:940] "Client rotation is on, will bootstrap in background" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.326883 4728 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.326998 4728 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
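With client rotation on, the certificate manager lines just below report an expiry of 2026-02-24 but a rotation deadline of 2025-12-11, already in the past at boot, so it rotates immediately; the first CSR POST then fails with connection refused because api-int.crc.testing:6443 is not up yet this early in startup. The deadline placement matches, as I understand it, client-go's jitter rule of rotating at a uniformly random point 70-90% through the cert's validity window. A minimal sketch of that heuristic; the issue time is an assumption, since only the expiry appears in the log:

    import random
    from datetime import datetime, timedelta

    # Jittered rotation deadline: 70-90% of the way through the validity window.
    not_before = datetime(2025, 2, 24, 5, 52, 8)   # hypothetical notBefore
    not_after = datetime(2026, 2, 24, 5, 52, 8)    # expiry reported below

    span = (not_after - not_before).total_seconds()
    deadline = not_before + timedelta(seconds=span * random.uniform(0.7, 0.9))

    now = datetime(2025, 12, 16, 14, 56, 59)       # boot time from the log
    print("rotation deadline:", deadline)
    print("deadline already passed, rotate now:", deadline < now)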
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.327650 4728 server.go:997] "Starting client certificate rotation"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.327680 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.327859 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-11 09:25:45.911187939 +0000 UTC
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.327971 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.334763 4728 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.336511 4728 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.338274 4728 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.348771 4728 log.go:25] "Validated CRI v1 runtime API"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.376309 4728 log.go:25] "Validated CRI v1 image API"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.378200 4728 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.381600 4728 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-16-14-52-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.381657 4728 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.416945 4728 manager.go:217] Machine: {Timestamp:2025-12-16 14:56:59.414320965 +0000 UTC m=+0.254500039 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6cdaa06a-6501-425c-95d7-724f6caa86b7 BootID:0442088b-ca61-48c2-99d7-338f049fa924 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7e:68:8e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7e:68:8e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:94:de:06 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:98:4f:60 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:48:24:f5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1e:4d:23 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:7c:37:92:53:7a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:92:1b:cd:51:e6:80 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.417442 4728 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.417736 4728 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.418584 4728 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.418912 4728 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.418969 4728 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.419462 4728 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.419492 4728 container_manager_linux.go:303] "Creating device plugin manager"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.419931 4728 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.420009 4728 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.420500 4728 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.420696 4728 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.421737 4728 kubelet.go:418] "Attempting to sync node with API server"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.421825 4728 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.421876 4728 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.421900 4728 kubelet.go:324] "Adding apiserver pod source"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.421929 4728 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.424484 4728 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.424954 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.424962 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.425118 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.425149 4728 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.425282 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.426958 4728 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.427896 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.427938 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.427954 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.427969 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.427991 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428005 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428022 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428044 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428061 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428074 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428095 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428109 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.428706 4728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.429608 4728 server.go:1280] "Started kubelet"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.430014 4728 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.432042 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.430047 4728 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 14:56:59 crc systemd[1]: Started Kubernetes Kubelet.
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.435081 4728 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.434287 4728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.210:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1881b9fba904105b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 14:56:59.429490779 +0000 UTC m=+0.269669803,LastTimestamp:2025-12-16 14:56:59.429490779 +0000 UTC m=+0.269669803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.436454 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.436551 4728 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.436673 4728 server.go:460] "Adding debug handlers to kubelet server"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.436855 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:13:16.259173737 +0000 UTC
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.437666 4728 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.437711 4728 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.437695 4728 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.437894 4728 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.438339 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="200ms"
Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.443714 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.443925 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.444531 4728 factory.go:153] Registering CRI-O factory
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.444604 4728 factory.go:221] Registration of the crio container factory successfully
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.444883 4728 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.444935 4728 factory.go:55] Registering systemd factory
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.444974 4728 factory.go:221] Registration of the systemd container factory successfully
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.445017 4728 factory.go:103] Registering Raw factory
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.445048 4728 manager.go:1196] Started watching for new ooms in manager
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.449595 4728 manager.go:319] Starting recovery of all containers
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.461824 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.461952 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.461983 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462007 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462094 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462118 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462143 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462168 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462197 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462221 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462246 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462269 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462299 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462331 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462358 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462382 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462441 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462477 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462500 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462522 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462612 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462640 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462665 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462737 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462768 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462800 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462839 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462866 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462930 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462956 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.462989 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463014 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463041 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463065 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463087 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463108 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463194 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463220 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463241 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463264 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463288 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463310 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463333 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463363 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463453 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463478 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463507 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463528 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463551 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463572 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463593 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463614 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463849 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463879 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463902 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.463926 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464033 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464056 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464077 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464102 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464166 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464243 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464271 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464296 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464319 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464343 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464367 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464391 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464482 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464505 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464527 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464547 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464605 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464625 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464657 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464682 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464794 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464850 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464875 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464898 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464919 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.464979 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465000 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465022 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465075 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465098 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465119 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465140 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465227 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465250 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465282 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465305 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465381 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465402 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465520 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465546 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465569 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465592 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465613 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465640 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465720 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465742 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465766 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465788 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465948 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.465978 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466002 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466028 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466087 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466112 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466136 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466158 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466245 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466267 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466288 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466365 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466448 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466502 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466525 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466575 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466596 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466618 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466639 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466660 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466718 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466742 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466764 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466786 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466807 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466828 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466848 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466868 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.466994 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467022 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467045 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467069 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467092 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467113 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467134 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467156 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467224 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def"
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467247 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467268 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467289 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467576 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467728 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467763 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467787 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467810 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467862 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467885 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467910 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.467935 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468007 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468029 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468091 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468113 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468135 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468156 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468180 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468240 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468297 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468319 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468341 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468363 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468385 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468433 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468456 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468479 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468526 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468549 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468571 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468591 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468622 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.468644 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.469673 4728 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.469848 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.469901 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.469924 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.469969 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.469995 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470017 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470040 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470062 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470083 4728 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470144 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470165 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470186 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470207 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470229 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470251 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470272 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470320 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470342 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470364 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470385 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470498 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470525 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470549 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470571 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470593 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470616 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470640 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470663 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470684 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470708 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470741 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470774 4728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470797 4728 reconstruct.go:97] "Volume reconstruction finished" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.470812 4728 reconciler.go:26] "Reconciler: start to sync state" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.483902 4728 manager.go:324] Recovery completed Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.497495 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.500182 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.500292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.500314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.501562 4728 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.501594 4728 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.501618 4728 state_mem.go:36] "Initialized new in-memory state store" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.502155 4728 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.505012 4728 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.505053 4728 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.505087 4728 kubelet.go:2335] "Starting kubelet main sync loop" Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.505141 4728 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.509467 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.509603 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.510076 4728 policy_none.go:49] "None policy: Start" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.514244 4728 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.514286 4728 state_mem.go:35] "Initializing new in-memory state store" Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.538714 4728 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.570063 4728 manager.go:334] "Starting Device Plugin manager" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.570159 4728 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.570221 4728 server.go:79] "Starting device plugin registration server" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.570968 4728 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.570997 4728 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.571637 4728 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.571896 4728 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.571917 4728 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.579587 4728 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.606169 4728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 14:56:59 crc kubenswrapper[4728]: 
I1216 14:56:59.606396 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.608454 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.608520 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.608540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.608737 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.609188 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.609322 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.609841 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.609890 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.609908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.610070 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.610317 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.610443 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.610838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.610908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.610934 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611814 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611890 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.611946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.612570 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.612651 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.613167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.613466 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.613505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.613726 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.614128 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.614222 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615135 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615210 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615279 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615332 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615639 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.615714 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.616223 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.616286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.616314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.617355 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.617396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.617444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.639887 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="400ms" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.671943 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.673560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.673616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc 
kubenswrapper[4728]: I1216 14:56:59.673638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.673676 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.673689 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.673764 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.673802 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674071 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674177 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674261 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674296 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674373 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.674388 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: 
connect: connection refused" node="crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674434 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674520 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674563 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674601 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674629 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674662 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.674683 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.776676 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.776849 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.776937 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777010 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777047 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777111 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777147 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777212 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777246 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777312 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777344 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777481 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777544 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777583 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777866 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777897 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777953 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777954 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778013 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777992 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778047 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778036 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778029 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.777995 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778144 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778147 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778130 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778192 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.778263 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.875237 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.877804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.877907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.877931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.878017 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:56:59 crc kubenswrapper[4728]: E1216 14:56:59.879015 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.941096 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.949193 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.965774 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.986352 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.987929 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3d6533117f72e796179f59fb24fc004ad6f646d1a32ed9c2052a3e7fc199247b WatchSource:0}: Error finding container 3d6533117f72e796179f59fb24fc004ad6f646d1a32ed9c2052a3e7fc199247b: Status 404 returned error can't find the container with id 3d6533117f72e796179f59fb24fc004ad6f646d1a32ed9c2052a3e7fc199247b Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.988636 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d6c989f1603858791ca7cdc8cb889bdfb3d3e4267b9f4229560e18f93a46364b WatchSource:0}: Error finding container d6c989f1603858791ca7cdc8cb889bdfb3d3e4267b9f4229560e18f93a46364b: Status 404 returned error can't find the container with id d6c989f1603858791ca7cdc8cb889bdfb3d3e4267b9f4229560e18f93a46364b Dec 16 14:56:59 crc kubenswrapper[4728]: I1216 14:56:59.989976 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:56:59 crc kubenswrapper[4728]: W1216 14:56:59.994889 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c67db83b3452da86051a7f8b1f92e8f54ae9e3b99eb31bd666259aea057a9ecd WatchSource:0}: Error finding container c67db83b3452da86051a7f8b1f92e8f54ae9e3b99eb31bd666259aea057a9ecd: Status 404 returned error can't find the container with id c67db83b3452da86051a7f8b1f92e8f54ae9e3b99eb31bd666259aea057a9ecd Dec 16 14:57:00 crc kubenswrapper[4728]: W1216 14:57:00.008169 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4aab830785f6e1b1498b71f16b1f753c7599910495484027fef5b5b0120eb549 WatchSource:0}: Error finding container 4aab830785f6e1b1498b71f16b1f753c7599910495484027fef5b5b0120eb549: Status 404 returned error can't find the container with id 4aab830785f6e1b1498b71f16b1f753c7599910495484027fef5b5b0120eb549 Dec 16 14:57:00 crc kubenswrapper[4728]: W1216 14:57:00.019554 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1eee8a335e6adf51b7f42dd30afae2ebaa78004da84541e8a490044898a46151 WatchSource:0}: Error finding container 1eee8a335e6adf51b7f42dd30afae2ebaa78004da84541e8a490044898a46151: Status 404 returned error can't find the container with id 1eee8a335e6adf51b7f42dd30afae2ebaa78004da84541e8a490044898a46151 Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.041282 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="800ms" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.280059 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.282004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.282061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.282072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.282107 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.282683 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.434015 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.437716 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:41:45.935856373 +0000 UTC Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.437815 4728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 83h44m45.498048319s for next certificate rotation Dec 16 14:57:00 crc kubenswrapper[4728]: W1216 14:57:00.438039 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.438139 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.512443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.512607 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1eee8a335e6adf51b7f42dd30afae2ebaa78004da84541e8a490044898a46151"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.512750 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.514680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.514715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.514729 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.520483 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.520566 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4aab830785f6e1b1498b71f16b1f753c7599910495484027fef5b5b0120eb549"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.523806 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.523891 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c67db83b3452da86051a7f8b1f92e8f54ae9e3b99eb31bd666259aea057a9ecd"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.524074 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.526013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.526065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.526085 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.527082 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.527114 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6c989f1603858791ca7cdc8cb889bdfb3d3e4267b9f4229560e18f93a46364b"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.527240 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.527932 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.527964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.527975 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.529377 4728 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732" exitCode=0 Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.529426 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732"} Dec 16 14:57:00 crc kubenswrapper[4728]: I1216 14:57:00.529446 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3d6533117f72e796179f59fb24fc004ad6f646d1a32ed9c2052a3e7fc199247b"} Dec 16 14:57:00 crc kubenswrapper[4728]: W1216 14:57:00.679982 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.680159 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:57:00 crc kubenswrapper[4728]: W1216 14:57:00.829230 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.829370 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.843154 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="1.6s" Dec 16 14:57:00 crc kubenswrapper[4728]: W1216 14:57:00.997729 4728 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:57:00 crc kubenswrapper[4728]: E1216 14:57:00.997866 4728 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.083546 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.085761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.085817 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.085837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.085882 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:57:01 crc kubenswrapper[4728]: E1216 14:57:01.086564 4728 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.391471 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 14:57:01 crc kubenswrapper[4728]: E1216 14:57:01.392968 4728 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.433494 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.534092 4728 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775" exitCode=0 Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.534176 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.534298 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.538963 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.539001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.539018 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.544755 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.544851 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.544865 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.544884 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.546318 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.546367 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.546393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.547878 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2" exitCode=0 Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.547949 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.548202 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.549479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.549542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.549563 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.551044 4728 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20" exitCode=0 Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.551079 4728 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7" exitCode=0 Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.551125 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.551172 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.551181 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7"} Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.551325 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.552458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.552500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.552518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.552712 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.552783 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.552804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.556553 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.557774 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.557833 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:01 crc kubenswrapper[4728]: I1216 14:57:01.557862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.556813 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.556880 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.556900 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.557069 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.558600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.558648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.558659 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.562067 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.562097 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.562111 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.562123 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.564459 4728 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c" exitCode=0 Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.564503 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.564803 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569021 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569142 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930"} Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569491 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569694 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569734 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.569997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.570015 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.571311 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.571353 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.571367 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.687368 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.688835 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.688911 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.688929 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:02 crc kubenswrapper[4728]: I1216 14:57:02.688970 4728 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.581861 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2"} Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.581944 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.583708 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.583770 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.583787 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.590116 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002"} Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.590195 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f"} Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.590216 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95"} Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.590234 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622"} Dec 16 14:57:03 crc kubenswrapper[4728]: I1216 14:57:03.612870 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.599481 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece"} Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.599575 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.599629 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.599555 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:04 crc kubenswrapper[4728]: 
I1216 14:57:04.601090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.601146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.601164 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.601196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.601235 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.601253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.936713 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.936975 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.939271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.939333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:04 crc kubenswrapper[4728]: I1216 14:57:04.939350 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.345536 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.356745 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.602736 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.603827 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.603904 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610776 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610797 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610877 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610956 4728 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610877 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.611021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.611042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.610978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:05 crc kubenswrapper[4728]: I1216 14:57:05.648062 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 16 14:57:06 crc kubenswrapper[4728]: I1216 14:57:06.606119 4728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 14:57:06 crc kubenswrapper[4728]: I1216 14:57:06.606210 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:06 crc kubenswrapper[4728]: I1216 14:57:06.607468 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:06 crc kubenswrapper[4728]: I1216 14:57:06.607541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:06 crc kubenswrapper[4728]: I1216 14:57:06.607561 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:07 crc kubenswrapper[4728]: I1216 14:57:07.235365 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:07 crc kubenswrapper[4728]: I1216 14:57:07.235609 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:07 crc kubenswrapper[4728]: I1216 14:57:07.237719 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:07 crc kubenswrapper[4728]: I1216 14:57:07.237761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:07 crc kubenswrapper[4728]: I1216 14:57:07.237773 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:08 crc kubenswrapper[4728]: I1216 14:57:08.101674 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:08 crc kubenswrapper[4728]: I1216 14:57:08.101934 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:08 crc kubenswrapper[4728]: I1216 14:57:08.104094 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:08 crc kubenswrapper[4728]: I1216 14:57:08.104170 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:08 crc kubenswrapper[4728]: I1216 14:57:08.104261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:09 crc 
kubenswrapper[4728]: E1216 14:57:09.580285 4728 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 16 14:57:09 crc kubenswrapper[4728]: I1216 14:57:09.591377 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 16 14:57:09 crc kubenswrapper[4728]: I1216 14:57:09.591668 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:09 crc kubenswrapper[4728]: I1216 14:57:09.593035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:09 crc kubenswrapper[4728]: I1216 14:57:09.593092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:09 crc kubenswrapper[4728]: I1216 14:57:09.593111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.178700 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.178962 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.180679 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.180735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.180759 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.184469 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.618910 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.620769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.620811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:10 crc kubenswrapper[4728]: I1216 14:57:10.620821 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:11 crc kubenswrapper[4728]: I1216 14:57:11.636991 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 16 14:57:11 crc kubenswrapper[4728]: I1216 14:57:11.637329 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:11 crc kubenswrapper[4728]: I1216 14:57:11.639071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:11 crc kubenswrapper[4728]: I1216 14:57:11.639129 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:11 crc kubenswrapper[4728]: I1216 14:57:11.639142 4728 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.005062 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.008456 4728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.010503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.010563 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.010586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.433479 4728 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 16 14:57:12 crc kubenswrapper[4728]: E1216 14:57:12.444880 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.712368 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.712490 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.720334 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 16 14:57:12 crc kubenswrapper[4728]: I1216 14:57:12.720468 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 16 14:57:13 crc kubenswrapper[4728]: I1216 14:57:13.115246 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 
14:57:13 crc kubenswrapper[4728]: I1216 14:57:13.115725 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 14:57:13 crc kubenswrapper[4728]: I1216 14:57:13.179229 4728 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 14:57:13 crc kubenswrapper[4728]: I1216 14:57:13.179370 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 14:57:13 crc kubenswrapper[4728]: I1216 14:57:13.621387 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]log ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]etcd ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/generic-apiserver-start-informers ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/priority-and-fairness-filter ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-apiextensions-informers ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-apiextensions-controllers ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/crd-informer-synced ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-system-namespaces-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 16 14:57:13 crc kubenswrapper[4728]: 
[-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 16 14:57:13 crc kubenswrapper[4728]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/bootstrap-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/start-kube-aggregator-informers ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-registration-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-discovery-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]autoregister-completion ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-openapi-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 16 14:57:13 crc kubenswrapper[4728]: livez check failed Dec 16 14:57:13 crc kubenswrapper[4728]: I1216 14:57:13.621527 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.694551 4728 trace.go:236] Trace[759798198]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:57:03.447) (total time: 14247ms): Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[759798198]: ---"Objects listed" error: 14247ms (14:57:17.694) Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[759798198]: [14.247076726s] [14.247076726s] END Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.694618 4728 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.695346 4728 trace.go:236] Trace[1652896874]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:57:03.270) (total time: 14425ms): Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[1652896874]: ---"Objects listed" error: 14425ms (14:57:17.695) Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[1652896874]: [14.425255807s] [14.425255807s] END Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.695378 4728 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.695949 4728 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.696315 4728 trace.go:236] Trace[1434518214]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:57:03.120) (total time: 14575ms): Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[1434518214]: ---"Objects listed" error: 14575ms (14:57:17.696) Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[1434518214]: [14.575401958s] [14.575401958s] END Dec 16 14:57:17 crc 
kubenswrapper[4728]: I1216 14:57:17.696334 4728 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.730931 4728 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.831992 4728 trace.go:236] Trace[1898927835]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Dec-2025 14:57:03.797) (total time: 14034ms): Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[1898927835]: ---"Objects listed" error: 14034ms (14:57:17.831) Dec 16 14:57:17 crc kubenswrapper[4728]: Trace[1898927835]: [14.034391205s] [14.034391205s] END Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.832042 4728 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.838681 4728 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.839071 4728 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.842272 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.842446 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.842480 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.842527 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.842553 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:17Z","lastTransitionTime":"2025-12-16T14:57:17Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 14:57:17 crc kubenswrapper[4728]: E1216 14:57:17.886129 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.894105 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.894157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.894172 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.894198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.894218 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:17Z","lastTransitionTime":"2025-12-16T14:57:17Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"}
Dec 16 14:57:17 crc kubenswrapper[4728]: E1216 14:57:17.912792 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.918479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.918517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.918528 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.918549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.918561 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:17Z","lastTransitionTime":"2025-12-16T14:57:17Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"}
Dec 16 14:57:17 crc kubenswrapper[4728]: E1216 14:57:17.934460 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.938462 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.938493 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.938505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.938524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.938536 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:17Z","lastTransitionTime":"2025-12-16T14:57:17Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"}
Dec 16 14:57:17 crc kubenswrapper[4728]: E1216 14:57:17.949244 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.952971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.953034 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.953049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.953075 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.953101 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:17Z","lastTransitionTime":"2025-12-16T14:57:17Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"}
Dec 16 14:57:17 crc kubenswrapper[4728]: E1216 14:57:17.972080 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:17Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:17 crc kubenswrapper[4728]: E1216 14:57:17.972221 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.977749 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 
14:57:17.977796 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.977812 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.977837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:17 crc kubenswrapper[4728]: I1216 14:57:17.977851 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:17Z","lastTransitionTime":"2025-12-16T14:57:17Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.080718 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.080777 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.080790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.080816 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.080833 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.183373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.183420 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.183450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.183472 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.183484 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.286343 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.286465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.286486 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.286954 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.287020 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.391341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.391389 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.391400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.391446 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.391458 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.433868 4728 apiserver.go:52] "Watching apiserver" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.438643 4728 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.439175 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.439879 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.440184 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.440310 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.440341 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.440388 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.440573 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.440669 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.440716 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.441045 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.443079 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.443278 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.443083 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.443481 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.444013 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.445339 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.445390 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.445499 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.446510 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.471340 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.487810 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.494075 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.494135 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.494182 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.494213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.494230 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.507641 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.524577 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.539952 4728 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.543295 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.560279 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.572527 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.586568 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.598158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.598227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.598247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.598274 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.598292 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601703 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601783 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601822 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601857 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601887 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601919 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601953 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.601984 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602055 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602088 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602120 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602152 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602186 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602228 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602259 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602292 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602325 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602359 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602367 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602392 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602490 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602538 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602609 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602599 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602635 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602662 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602687 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602712 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602741 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602762 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602794 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602762 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602827 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602853 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602882 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602915 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602943 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602980 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603009 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603032 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603090 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603160 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" 
(UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603190 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603217 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603241 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603270 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603296 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603322 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603394 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603423 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603464 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603488 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603515 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603543 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603566 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603589 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603614 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603636 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603697 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603721 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603749 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603774 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603797 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603829 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603855 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603879 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603903 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603927 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603951 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603977 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604056 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604087 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604119 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604166 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604189 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604214 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604243 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604271 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604295 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604317 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604341 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604364 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604392 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604421 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604460 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604485 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604510 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604547 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604570 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604596 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 16 14:57:18 
crc kubenswrapper[4728]: I1216 14:57:18.604678 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604703 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604731 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604755 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604780 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604811 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604837 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604861 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604925 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604953 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 14:57:18 crc 
kubenswrapper[4728]: I1216 14:57:18.604985 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605013 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605044 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605083 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605170 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605195 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605224 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605255 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605282 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605304 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602762 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602924 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605291 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.602950 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603232 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603240 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603266 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603700 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603743 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603784 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.603869 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604096 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604094 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604154 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.604264 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605295 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605400 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605917 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605943 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.606783 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.606860 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.606873 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.606924 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.607063 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.607716 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.605329 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.607966 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.607830 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608013 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608015 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608058 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608066 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608098 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608180 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608308 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608446 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608509 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608616 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608661 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608703 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608753 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608824 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608878 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608873 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608926 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608968 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609050 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609196 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609241 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609280 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609323 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609364 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609401 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609507 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609578 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609630 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609686 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609734 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609784 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609833 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609881 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610114 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610170 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610219 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610275 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610342 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610390 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610481 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610534 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610616 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610677 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610731 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610783 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610839 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610890 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610937 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610991 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611043 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611093 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611147 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611201 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611262 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611322 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611377 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611502 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611563 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611619 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611671 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611729 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611789 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611844 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611895 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611942 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611996 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612051 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612103 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612152 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612206 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612259 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612307 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612358 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612415 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612503 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612563 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612616 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612668 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612721 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612776 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612831 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612889 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612946 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613004 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613058 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613145 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613200 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613252 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613300 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613391 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613502 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613561 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613614 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613675 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613728 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613781 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613844 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608607 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608820 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613898 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613957 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614020 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614075 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614135 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614193 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614307 4728 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614346 4728 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614383 4728 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614419 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614476 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614548 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614581 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614608 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614634 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614664 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614691 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614719 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614745 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614770 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614797 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614823 4728 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614855 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614883 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614910 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614937 4728 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614966 4728 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614994 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615022 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615051 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615082 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615107 4728 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615133 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615157 4728 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615186 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615210 4728 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615235 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615261 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615289 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615317 4728 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615346 4728 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615373 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615400 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.629519 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.630669 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608919 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609313 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609329 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.608271 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609570 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.609859 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610344 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610383 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.610965 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611052 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611343 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.611765 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612003 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612063 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612217 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612862 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). 
InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.631121 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612888 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.612937 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613078 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613249 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.613851 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.614797 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615040 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615377 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615582 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.615618 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.631764 4728 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615949 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.615978 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.616069 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.616292 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.616650 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.616691 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.616708 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.616748 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617343 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617514 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617530 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617616 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617744 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.617833 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.117788032 +0000 UTC m=+19.957967056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.632375 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.132330397 +0000 UTC m=+19.972509431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617837 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.632562 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617913 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617801 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.617985 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618153 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618409 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618571 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618642 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618606 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618716 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618754 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.618793 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.619283 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.619343 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.619362 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.619476 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.619619 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.620138 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.620794 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.621092 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.621203 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.621223 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.622409 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.622496 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.622825 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.623684 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.623749 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.624390 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.624387 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.624964 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.625049 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.626259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.626369 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.626826 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.626933 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.627188 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.627331 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.627640 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.627750 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.627769 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.628223 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.628398 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.629001 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.628442 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.630414 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.633269 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.633368 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.634154 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.134109546 +0000 UTC m=+19.974288560 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.634641 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.635010 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.635010 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.635343 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.635464 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.636009 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.636642 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.636801 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.636850 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637531 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637572 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637666 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637745 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637739 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637792 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.637801 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.638017 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.638250 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.638296 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.641922 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.641978 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.644514 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.645833 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.649221 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.650170 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.650530 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.650580 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.650636 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.650803 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.150777579 +0000 UTC m=+19.990956573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.651916 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.653627 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.655776 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2" exitCode=255 Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.655991 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.662151 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.662627 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.662673 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.663856 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.664225 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.663698 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.665943 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.666006 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.666189 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.666204 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:18 crc kubenswrapper[4728]: E1216 14:57:18.666625 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:19.166583848 +0000 UTC m=+20.006762842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.667793 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.669989 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.670996 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.671229 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.671251 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.671650 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.673042 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.676201 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.676444 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.677180 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.677783 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.678079 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.678939 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.678964 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.679220 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.681164 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.681914 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.682578 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683150 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683174 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683271 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683560 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683644 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683833 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.683959 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.684059 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.684292 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.684676 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.684791 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.685015 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.685355 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.685460 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.685492 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.685594 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686047 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686167 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686202 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686282 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686308 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686346 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686570 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686721 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.686856 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.687846 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.687954 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.690732 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.702273 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.704917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.704959 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.704973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.704997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.705011 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.716209 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.716484 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.716609 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.716713 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.716819 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: W1216 14:57:18.717084 4728 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717148 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717341 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717406 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717237 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717477 4728 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717491 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717503 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717515 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717526 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717539 4728 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717550 4728 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717562 4728 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717575 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717587 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717601 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717614 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717628 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717639 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717651 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717663 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717677 4728 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717691 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717703 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717718 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717765 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717781 4728 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717795 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717808 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717820 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717833 4728 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717845 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717856 4728 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717869 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717881 4728 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717897 4728 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717911 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717925 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717939 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717952 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717966 4728 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717979 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.717992 4728 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718004 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718016 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718028 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718044 4728 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718056 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718070 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718083 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718095 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718107 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718119 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718131 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718143 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718154 4728 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718167 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718181 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718193 4728 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718206 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718218 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718230 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718244 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718256 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718269 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718280 4728 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718291 4728 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718301 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718314 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718326 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718337 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718350 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718361 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718376 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718387 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718398 4728 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718413 4728 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718424 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718454 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718466 4728 reconciler_common.go:293] "Volume 
detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718476 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718487 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718499 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718511 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718495 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718523 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718614 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718638 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718655 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718670 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718684 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718700 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718716 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718732 4728 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718750 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718764 4728 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718780 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718796 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718810 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718824 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718839 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718853 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718867 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718880 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718892 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718906 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718922 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718933 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718946 4728 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718959 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718972 4728 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718984 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.718997 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719011 4728 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719023 4728 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719037 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719050 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719065 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719080 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719092 4728 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719106 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719121 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719135 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719149 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719164 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719180 4728 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719196 4728 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719209 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719224 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719240 4728 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719254 4728 reconciler_common.go:293] "Volume detached for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719268 4728 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719281 4728 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719295 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719310 4728 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719323 4728 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719339 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719352 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719367 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719381 4728 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719396 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719414 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719426 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719455 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" 
(UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719467 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719480 4728 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719496 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719510 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719587 4728 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719601 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719615 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719657 4728 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719670 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719685 4728 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719699 4728 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719711 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719744 4728 reconciler_common.go:293] "Volume detached for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.719757 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.721394 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.725130 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.726517 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.731885 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.741252 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.754228 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.763811 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.765909 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.775858 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.779182 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.789520 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: W1216 14:57:18.791048 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fc142a728e36b641a9697ed970124b36847d937006e71710469e21d16022b3f7 WatchSource:0}: Error finding container fc142a728e36b641a9697ed970124b36847d937006e71710469e21d16022b3f7: Status 404 returned error can't find the container with id fc142a728e36b641a9697ed970124b36847d937006e71710469e21d16022b3f7 Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.792860 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.801817 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.807656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.807700 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.807712 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.807732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.807745 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:18 crc kubenswrapper[4728]: W1216 14:57:18.812960 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8448646d8d45972170a0419dadce5ff68433f7b6810b41535246ef0759a1be05 WatchSource:0}: Error finding container 8448646d8d45972170a0419dadce5ff68433f7b6810b41535246ef0759a1be05: Status 404 returned error can't find the container with id 8448646d8d45972170a0419dadce5ff68433f7b6810b41535246ef0759a1be05 Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.815868 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.820756 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.820794 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.820832 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.829473 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.910750 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.910801 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.910815 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.910833 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:18 crc kubenswrapper[4728]: I1216 14:57:18.910850 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:18Z","lastTransitionTime":"2025-12-16T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.013898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.013961 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.013981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.014009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.014030 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.117020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.117060 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.117069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.117090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.117099 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.123913 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.124206 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.124175948 +0000 UTC m=+20.964354932 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.220359 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.220442 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.220455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.220479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.220494 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.224894 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.224973 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.225012 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.225052 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225140 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225163 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225207 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225266 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225333 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225356 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225175 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225461 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225281 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.225259185 +0000 UTC m=+21.065438169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225553 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.225530832 +0000 UTC m=+21.065709966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225577 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.225563383 +0000 UTC m=+21.065742557 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.225600 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:20.225589834 +0000 UTC m=+21.065769038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.322795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.322864 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.322883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.322907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.322921 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.425433 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.425495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.425513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.425536 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.425551 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.506261 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:19 crc kubenswrapper[4728]: E1216 14:57:19.506600 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.515261 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.515979 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.518340 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.519167 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.519946 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.520603 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.521215 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.521968 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.522901 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.523584 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.524236 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.525137 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.525875 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.528224 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.528256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.528396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.528421 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.528440 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.528453 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.529527 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.531652 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.533235 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.534161 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.536086 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.537253 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.538224 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.539705 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.542709 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.543565 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.544479 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.545852 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.546707 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.548747 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.549645 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.551131 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.551663 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.552275 4728 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.552895 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.554711 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.555379 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.556304 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.558000 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.558742 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.559794 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.560514 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.561716 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.562258 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.563286 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-synce
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.563459 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.564092 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.565178 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.565919 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.567189 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.567763 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.569114 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.569707 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.570573 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.571118 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.572090 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.572790 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.573300 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.586949 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.605955 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.623865 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.631508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.631557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.631569 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.631590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.631603 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.642422 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.657595 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.660468 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.660556 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bc92ccd14403b99882663e3a5e1aeab0e341edafd5b70a6aba149b462436070b"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.661749 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8448646d8d45972170a0419dadce5ff68433f7b6810b41535246ef0759a1be05"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.663784 4728 scope.go:117] "RemoveContainer" containerID="b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.664101 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.664131 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.664144 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fc142a728e36b641a9697ed970124b36847d937006e71710469e21d16022b3f7"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.682955 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.701491 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.715934 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.735587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.735673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.735699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.735732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.735758 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.739662 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.754930 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.777617 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.793439 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.820316 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.838352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.838392 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.838415 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.838432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.838444 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.838804 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.858330 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.873976 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.896368 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.911823 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z"
Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.934581 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.942237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.942294 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.942308 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.942333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:19 crc kubenswrapper[4728]: I1216 14:57:19.942346 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:19Z","lastTransitionTime":"2025-12-16T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.045493 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.045538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.045549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.045567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.045581 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.134273 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.134547 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:22.134508811 +0000 UTC m=+22.974687835 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.148158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.148212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.148226 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.148245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.148257 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.193073 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.200494 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.210271 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.212543 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.229891 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.235622 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.235696 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.235747 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.235797 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.235802 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.235930 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:22.235892977 +0000 UTC m=+23.076072151 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.235938 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.235972 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.235964 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.236105 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:22.236071281 +0000 UTC m=+23.076250285 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.235993 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.236178 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:22.236168845 +0000 UTC m=+23.076347839 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.236341 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.236359 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.236369 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.236460 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:22.236446052 +0000 UTC m=+23.076625036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.245871 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.251525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.251601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.251625 4728 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.251653 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.251672 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.261920 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.273594 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.286844 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.301975 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z"
Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.318688 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.330237 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.348826 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.354077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.354134 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.354148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.354168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.354182 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.368840 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.383607 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.397531 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.413861 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.434383 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.456931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.456979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.456987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.457004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.457014 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.505729 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.505756 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.505932 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:20 crc kubenswrapper[4728]: E1216 14:57:20.506122 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.561638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.561726 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.561751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.561786 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.561805 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.664506 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.664543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.664553 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.664571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.664582 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.667802 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.670045 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.670779 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.689383 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.706753 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.723766 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.742124 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.766804 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.767955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.768026 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.768044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.768073 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.768092 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.792874 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.815629 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.835339 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.875078 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.875157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.875175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.875210 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.875227 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.979212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.979266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.979275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.979292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:20 crc kubenswrapper[4728]: I1216 14:57:20.979302 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:20Z","lastTransitionTime":"2025-12-16T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.082162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.082246 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.082273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.082304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.082322 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.186589 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.186675 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.186699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.186734 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.186756 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.289781 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.289834 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.289844 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.289865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.289885 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.393233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.393313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.393329 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.393358 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.393377 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.496940 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.496996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.497012 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.497039 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.497059 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.506584 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:21 crc kubenswrapper[4728]: E1216 14:57:21.506791 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.611000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.611067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.611081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.611101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.611115 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.676578 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.694521 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.698171 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.699480 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.714004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.714280 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.714552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.714798 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.715009 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.722195 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.738211 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.753254 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.773834 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.794455 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.812462 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.818143 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.818203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.818225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.818252 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.818266 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.831837 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.853757 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.884792 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.900105 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.914915 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.920942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.920985 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.921009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.921032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.921044 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:21Z","lastTransitionTime":"2025-12-16T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.928570 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.953782 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:21 crc kubenswrapper[4728]: I1216 14:57:21.985161 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.010691 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.023596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.023650 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.023663 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.023684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.023700 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.025399 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.126804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.126868 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.126887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.126917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.126935 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.153530 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.153870 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:26.153822932 +0000 UTC m=+26.994001946 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.230325 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.230374 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.230383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.230400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.230434 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.255139 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.255222 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.255266 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.255303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255457 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255469 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255530 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255552 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255545 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255569 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255638 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:26.255609269 +0000 UTC m=+27.095788283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255706 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:26.25566369 +0000 UTC m=+27.095842744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255490 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255749 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:26.255730332 +0000 UTC m=+27.095909356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255791 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.255868 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:26.255839765 +0000 UTC m=+27.096018789 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.334511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.334581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.334592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.334614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.334627 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.438614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.438690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.438711 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.438742 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.438761 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.505765 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.505896 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.506014 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.506286 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.542628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.542692 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.542709 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.542735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.542753 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.645734 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.645844 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.645872 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.645989 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.646098 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.678201 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3"} Dec 16 14:57:22 crc kubenswrapper[4728]: E1216 14:57:22.699157 4728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.702968 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.722622 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.742738 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.748865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.748917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.748934 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.748956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.748972 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.772313 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.790255 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.802879 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.817789 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.841358 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.851148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.851177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.851185 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.851199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.851208 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.857910 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.953732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.953785 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.953795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.953811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:22 crc kubenswrapper[4728]: I1216 14:57:22.953822 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:22Z","lastTransitionTime":"2025-12-16T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.056332 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.056371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.056380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.056395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.056407 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.107270 4728 csr.go:261] certificate signing request csr-2lwwd is approved, waiting to be issued Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.116902 4728 csr.go:257] certificate signing request csr-2lwwd is issued Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.159129 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.159168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.159178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.159194 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.159207 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.261617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.261649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.261662 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.261677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.261687 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.364590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.364641 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.364655 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.364675 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.364686 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.467452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.467543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.467557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.467580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.467591 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.506136 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:23 crc kubenswrapper[4728]: E1216 14:57:23.506303 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.570088 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.570153 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.570169 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.570190 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.570202 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.672855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.672905 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.672916 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.672934 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.672946 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.779651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.779717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.779733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.779757 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.779772 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.882166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.882211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.882220 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.882237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.882249 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.962738 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6lqf6"] Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.963148 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966076 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966119 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966270 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966404 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-njzmx"] Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966754 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bdpsg"] Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966961 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.966981 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bdpsg" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.968563 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.970567 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.970668 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.971436 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.971692 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.971734 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.971765 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.971783 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.971780 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.972059 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.980959 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.984543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.984581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.984593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.984611 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.984621 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:23Z","lastTransitionTime":"2025-12-16T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:23 crc kubenswrapper[4728]: I1216 14:57:23.996807 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.010212 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.022145 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.038382 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.057750 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.072895 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-cnibin\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.072952 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8dq\" (UniqueName: \"kubernetes.io/projected/d5cdc17e-067e-4d74-b768-02966221d3ae-kube-api-access-jj8dq\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073168 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-daemon-config\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073190 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-multus-certs\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073346 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-netns\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 
14:57:24.073463 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57f7e48b-7353-469c-ab9d-7f966c08d5f1-cni-binary-copy\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073485 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-k8s-cni-cncf-io\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073516 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8f7f\" (UniqueName: \"kubernetes.io/projected/57f7e48b-7353-469c-ab9d-7f966c08d5f1-kube-api-access-k8f7f\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073534 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-cni-multus\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073551 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-etc-kubernetes\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073764 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-cni-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073855 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5cdc17e-067e-4d74-b768-02966221d3ae-proxy-tls\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073898 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-socket-dir-parent\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073872 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.073946 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-cni-bin\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074189 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk27\" (UniqueName: \"kubernetes.io/projected/e13f8ca5-bf05-4740-be7d-81af5e57172b-kube-api-access-sbk27\") pod \"node-resolver-6lqf6\" (UID: \"e13f8ca5-bf05-4740-be7d-81af5e57172b\") " pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074227 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d5cdc17e-067e-4d74-b768-02966221d3ae-rootfs\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074267 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d5cdc17e-067e-4d74-b768-02966221d3ae-mcd-auth-proxy-config\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074343 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-conf-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074377 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e13f8ca5-bf05-4740-be7d-81af5e57172b-hosts-file\") pod \"node-resolver-6lqf6\" (UID: \"e13f8ca5-bf05-4740-be7d-81af5e57172b\") " pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074436 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-system-cni-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074469 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-kubelet\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074518 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-os-release\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.074550 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-hostroot\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.088133 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.088174 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.088184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.088203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.088217 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.091259 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.107825 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.118013 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-16 14:52:23 +0000 UTC, rotation deadline is 2026-10-22 01:25:37.430130654 +0000 UTC Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.118085 4728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7426h28m13.312050697s for next certificate rotation Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.127753 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.150957 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.165482 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.175940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-daemon-config\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.175998 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-multus-certs\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176044 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-netns\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176064 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57f7e48b-7353-469c-ab9d-7f966c08d5f1-cni-binary-copy\") pod 
\"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176088 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-k8s-cni-cncf-io\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176107 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8f7f\" (UniqueName: \"kubernetes.io/projected/57f7e48b-7353-469c-ab9d-7f966c08d5f1-kube-api-access-k8f7f\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176138 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-cni-multus\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176165 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-etc-kubernetes\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176199 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-cni-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176215 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5cdc17e-067e-4d74-b768-02966221d3ae-proxy-tls\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176235 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-socket-dir-parent\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176252 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-cni-bin\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176287 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk27\" (UniqueName: \"kubernetes.io/projected/e13f8ca5-bf05-4740-be7d-81af5e57172b-kube-api-access-sbk27\") pod \"node-resolver-6lqf6\" (UID: \"e13f8ca5-bf05-4740-be7d-81af5e57172b\") " pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc 
kubenswrapper[4728]: I1216 14:57:24.176305 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d5cdc17e-067e-4d74-b768-02966221d3ae-rootfs\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176325 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d5cdc17e-067e-4d74-b768-02966221d3ae-mcd-auth-proxy-config\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176346 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-conf-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176363 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e13f8ca5-bf05-4740-be7d-81af5e57172b-hosts-file\") pod \"node-resolver-6lqf6\" (UID: \"e13f8ca5-bf05-4740-be7d-81af5e57172b\") " pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176381 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-system-cni-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176431 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-kubelet\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176479 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-os-release\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176495 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-hostroot\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176531 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-cnibin\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176547 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj8dq\" (UniqueName: 
\"kubernetes.io/projected/d5cdc17e-067e-4d74-b768-02966221d3ae-kube-api-access-jj8dq\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-daemon-config\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176985 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-netns\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.176963 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-multus-certs\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.177279 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-run-k8s-cni-cncf-io\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.177487 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-cni-multus\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.177529 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-etc-kubernetes\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.177617 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57f7e48b-7353-469c-ab9d-7f966c08d5f1-cni-binary-copy\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.177667 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d5cdc17e-067e-4d74-b768-02966221d3ae-rootfs\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.177699 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-cni-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 
14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178190 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d5cdc17e-067e-4d74-b768-02966221d3ae-mcd-auth-proxy-config\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178227 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-conf-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178280 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e13f8ca5-bf05-4740-be7d-81af5e57172b-hosts-file\") pod \"node-resolver-6lqf6\" (UID: \"e13f8ca5-bf05-4740-be7d-81af5e57172b\") " pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178311 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-system-cni-dir\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178339 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-kubelet\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178382 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-os-release\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178409 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-hostroot\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178455 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-cnibin\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178483 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-host-var-lib-cni-bin\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.178582 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/57f7e48b-7353-469c-ab9d-7f966c08d5f1-multus-socket-dir-parent\") pod \"multus-bdpsg\" (UID: 
\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.182661 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d5cdc17e-067e-4d74-b768-02966221d3ae-proxy-tls\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.185264 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.191662 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.191704 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.191713 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.191732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.191741 4728 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.203074 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8f7f\" (UniqueName: \"kubernetes.io/projected/57f7e48b-7353-469c-ab9d-7f966c08d5f1-kube-api-access-k8f7f\") pod \"multus-bdpsg\" (UID: \"57f7e48b-7353-469c-ab9d-7f966c08d5f1\") " pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.203411 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk27\" (UniqueName: \"kubernetes.io/projected/e13f8ca5-bf05-4740-be7d-81af5e57172b-kube-api-access-sbk27\") pod \"node-resolver-6lqf6\" (UID: \"e13f8ca5-bf05-4740-be7d-81af5e57172b\") " pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.210729 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8dq\" (UniqueName: \"kubernetes.io/projected/d5cdc17e-067e-4d74-b768-02966221d3ae-kube-api-access-jj8dq\") pod \"machine-config-daemon-njzmx\" (UID: \"d5cdc17e-067e-4d74-b768-02966221d3ae\") " pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.213507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.228556 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.241644 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.254067 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.267531 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.278140 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6lqf6" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.288586 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bdpsg" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.290132 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.294686 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.299795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.299828 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.299840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.299856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.299866 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.308046 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: W1216 14:57:24.309154 4728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f7e48b_7353_469c_ab9d_7f966c08d5f1.slice/crio-c385dcba9db9ddd4b77a49906419ec0fa4f8faf2e50f0671bfbf1f35f266be26 WatchSource:0}: Error finding container c385dcba9db9ddd4b77a49906419ec0fa4f8faf2e50f0671bfbf1f35f266be26: Status 404 returned error can't find the container with id c385dcba9db9ddd4b77a49906419ec0fa4f8faf2e50f0671bfbf1f35f266be26 Dec 16 14:57:24 crc kubenswrapper[4728]: W1216 14:57:24.319963 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5cdc17e_067e_4d74_b768_02966221d3ae.slice/crio-9838335202ee4a537c53f98eec7b4d24b2f3355b2d1ebf720d4da69891dbad3d WatchSource:0}: Error finding container 9838335202ee4a537c53f98eec7b4d24b2f3355b2d1ebf720d4da69891dbad3d: Status 404 returned error can't find the container with id 9838335202ee4a537c53f98eec7b4d24b2f3355b2d1ebf720d4da69891dbad3d Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.335889 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCod
e\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.346762 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9nv7n"] Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.347632 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.348065 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2458v"] Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.348945 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.355730 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.355898 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.357360 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.357474 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.357647 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.357698 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.358063 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.358343 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.358528 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.360029 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 
14:57:24.375165 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.395797 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.402227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.402270 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.402283 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.402302 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.402313 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.413435 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserv
er-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.430914 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.443565 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.457175 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.469509 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.478989 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-os-release\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479057 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-log-socket\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479114 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-netd\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479142 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479186 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvx6k\" (UniqueName: \"kubernetes.io/projected/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-kube-api-access-mvx6k\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479229 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-system-cni-dir\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479255 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479288 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cnibin\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " 
pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479319 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479356 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-etc-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479385 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-ovn\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-bin\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479477 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-config\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479633 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-var-lib-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479723 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479767 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479827 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-node-log\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479894 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwj7t\" (UniqueName: \"kubernetes.io/projected/480f8c1b-60cc-4685-86cc-a457f645e87c-kube-api-access-jwj7t\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479925 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-kubelet\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479956 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-systemd\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.479984 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/480f8c1b-60cc-4685-86cc-a457f645e87c-ovn-node-metrics-cert\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.480131 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-slash\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.480221 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-netns\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.480319 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-systemd-units\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.480363 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-script-lib\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.480431 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.480519 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-env-overrides\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.488931 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.504795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.504832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.504843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.504860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.504872 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.505337 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:24 crc kubenswrapper[4728]: E1216 14:57:24.505456 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.505719 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:24 crc kubenswrapper[4728]: E1216 14:57:24.505774 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.517328 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.534685 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.550654 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.567160 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581077 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-bin\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581116 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-config\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581134 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-var-lib-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581152 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581168 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581183 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-node-log\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwj7t\" (UniqueName: \"kubernetes.io/projected/480f8c1b-60cc-4685-86cc-a457f645e87c-kube-api-access-jwj7t\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581217 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-kubelet\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581233 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-systemd\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581249 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/480f8c1b-60cc-4685-86cc-a457f645e87c-ovn-node-metrics-cert\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-slash\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-netns\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581326 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-systemd-units\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581341 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-script-lib\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581377 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581392 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-env-overrides\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581409 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-os-release\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581467 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-log-socket\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581495 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-netd\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581511 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581528 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvx6k\" (UniqueName: \"kubernetes.io/projected/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-kube-api-access-mvx6k\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581553 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-system-cni-dir\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581569 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581587 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cnibin\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581603 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581624 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-etc-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581644 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-ovn\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581689 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-netns\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581731 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-systemd-units\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581712 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-ovn\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.581780 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-bin\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582484 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-script-lib\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582513 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-config\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582523 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582559 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-var-lib-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582635 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582663 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-node-log\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-env-overrides\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582899 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-os-release\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582928 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-log-socket\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582945 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-kubelet\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-netd\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582987 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.582994 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-systemd\") pod 
\"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.583284 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-system-cni-dir\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.583809 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.583853 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cnibin\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.584239 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.584277 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-etc-openvswitch\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.584304 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-slash\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.585211 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.587409 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/480f8c1b-60cc-4685-86cc-a457f645e87c-ovn-node-metrics-cert\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.602289 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvx6k\" (UniqueName: \"kubernetes.io/projected/290f0e95-e5fa-4b56-acb0-babc0cf3c5d9-kube-api-access-mvx6k\") pod \"multus-additional-cni-plugins-9nv7n\" (UID: \"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\") " pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.602797 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwj7t\" (UniqueName: \"kubernetes.io/projected/480f8c1b-60cc-4685-86cc-a457f645e87c-kube-api-access-jwj7t\") pod \"ovnkube-node-2458v\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.607763 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.607788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.607797 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.607813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.607824 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.608876 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.684787 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.684900 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.684918 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"9838335202ee4a537c53f98eec7b4d24b2f3355b2d1ebf720d4da69891dbad3d"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.687734 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerStarted","Data":"25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.687777 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerStarted","Data":"c385dcba9db9ddd4b77a49906419ec0fa4f8faf2e50f0671bfbf1f35f266be26"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.689072 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6lqf6" event={"ID":"e13f8ca5-bf05-4740-be7d-81af5e57172b","Type":"ContainerStarted","Data":"cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.689160 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6lqf6" event={"ID":"e13f8ca5-bf05-4740-be7d-81af5e57172b","Type":"ContainerStarted","Data":"5e11ed91e2e71544141c840d295530afca2503e8c00fc55f625d58ef8bf3aec6"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.694058 4728 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.703281 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.705893 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID
\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.710572 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.710634 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.710647 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.710700 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.710714 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.722167 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.733923 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.752302 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.766716 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.788595 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.805170 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.813870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.814299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.814311 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.814333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.814346 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.820972 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.841507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.854889 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.873947 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.917548 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.917832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.917917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.918000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.918075 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:24Z","lastTransitionTime":"2025-12-16T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.926293 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.950676 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.965051 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:24 crc kubenswrapper[4728]: I1216 14:57:24.985710 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:24Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.001877 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.015888 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.020209 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.020380 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.020478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.020578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.020688 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.028019 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.041595 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.056526 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.073881 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.086374 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.101794 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.112753 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.122838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.123103 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.123228 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.123365 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.123515 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.128306 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47
8274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.144086 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.155506 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.169658 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.226190 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.226241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.226252 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.226275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.226292 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.328820 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.329124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.329137 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.329157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.329169 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.431964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.432056 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.432098 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.432134 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.432166 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.505808 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:25 crc kubenswrapper[4728]: E1216 14:57:25.505980 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.535426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.535490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.535499 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.535517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.535528 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.638736 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.638790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.638802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.638823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.638835 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.700459 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" exitCode=0 Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.700556 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.701243 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"f6a5dea1b098263c8c76edc066809df48b63e0ac843c23652034542da629e763"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.703560 4728 generic.go:334] "Generic (PLEG): container finished" podID="290f0e95-e5fa-4b56-acb0-babc0cf3c5d9" containerID="e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981" exitCode=0 Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.703634 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerDied","Data":"e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.703678 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerStarted","Data":"929fe48f5956d98032f8684e79bc0a4ec64c689f46b758bff257c3ceabe1e3b5"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.719335 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.733969 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.742127 4728 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.742180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.742196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.742214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.742226 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.748938 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.763452 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.778254 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.791922 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.806985 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.833284 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.849127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.849213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.849279 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.849306 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.849326 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.849559 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.865818 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.882400 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.897917 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.911999 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.931095 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.945073 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.952127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.952490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.952566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.952638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.952721 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:25Z","lastTransitionTime":"2025-12-16T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.962913 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:25 crc kubenswrapper[4728]: I1216 14:57:25.978434 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.000181 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:25Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.010089 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.022540 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.032843 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.052643 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.055621 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.055672 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.055684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.055705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.055715 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.070015 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.086116 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.105438 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.122767 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.155930 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc 
kubenswrapper[4728]: I1216 14:57:26.157977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.158017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.158027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.158042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.158052 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.199357 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.199602 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:34.199563378 +0000 UTC m=+35.039742562 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.202041 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.260281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.260320 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.260330 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.260344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.260353 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.301200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.301262 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.301314 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.301348 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301356 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301508 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-16 14:57:34.301482769 +0000 UTC m=+35.141661753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301510 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301591 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301619 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301507 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301685 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301708 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301620 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:34.301588232 +0000 UTC m=+35.141767256 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301638 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301758 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:34.301743596 +0000 UTC m=+35.141922590 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.301781 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:34.301771917 +0000 UTC m=+35.141950921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.362706 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.362752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.362765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.362788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.362801 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.465484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.465533 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.465545 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.465564 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.465577 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.506194 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.506244 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.506366 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:57:26 crc kubenswrapper[4728]: E1216 14:57:26.506609 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.569803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.569872 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.569891 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.569915 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.569931 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.672565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.672631 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.672647 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.672671 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.672687 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.712022 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.712385 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.712403 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.712450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.712460 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.712469 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"}
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.714272 4728 generic.go:334] "Generic (PLEG): container finished" podID="290f0e95-e5fa-4b56-acb0-babc0cf3c5d9" containerID="b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03" exitCode=0
Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.714318 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerDied","Data":"b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03"} Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.735015 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.755073 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.768707 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.775692 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.775811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.775835 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.775863 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.775881 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.787242 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.803986 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.821987 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.836524 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.862874 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4
00f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.878279 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.878954 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.879000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.879015 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.879037 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.879050 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.896169 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.924752 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.944791 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.962872 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.980666 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:26Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.981060 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.981104 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.981115 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.981132 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:26 crc kubenswrapper[4728]: I1216 14:57:26.981144 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:26Z","lastTransitionTime":"2025-12-16T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.083962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.084009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.084023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.084042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.084055 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.186772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.186828 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.186841 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.186861 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.186877 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.289561 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.289601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.289609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.289624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.289633 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.302617 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hbwhm"] Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.303059 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.305709 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.306084 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.306431 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.306754 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.312232 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2df7f0f-2588-4958-90ec-db2d62025b7f-host\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.312334 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2df7f0f-2588-4958-90ec-db2d62025b7f-serviceca\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.312511 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfx6z\" (UniqueName: \"kubernetes.io/projected/d2df7f0f-2588-4958-90ec-db2d62025b7f-kube-api-access-xfx6z\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.327150 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.343669 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin
\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.404794 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.406491 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.406538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.406554 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.406578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.406593 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.413327 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2df7f0f-2588-4958-90ec-db2d62025b7f-host\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.413443 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2df7f0f-2588-4958-90ec-db2d62025b7f-serviceca\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.413493 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfx6z\" (UniqueName: \"kubernetes.io/projected/d2df7f0f-2588-4958-90ec-db2d62025b7f-kube-api-access-xfx6z\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.413574 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2df7f0f-2588-4958-90ec-db2d62025b7f-host\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.414512 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2df7f0f-2588-4958-90ec-db2d62025b7f-serviceca\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.423090 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.437924 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.438245 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xfx6z\" (UniqueName: \"kubernetes.io/projected/d2df7f0f-2588-4958-90ec-db2d62025b7f-kube-api-access-xfx6z\") pod \"node-ca-hbwhm\" (UID: \"d2df7f0f-2588-4958-90ec-db2d62025b7f\") " pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.452589 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.465279 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.481032 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.498964 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.505951 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:27 crc kubenswrapper[4728]: E1216 14:57:27.506141 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.509372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.509448 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.509465 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.509483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.509499 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.513984 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.533030 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.565771 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.586213 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.605296 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.612383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.612461 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.612473 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.612497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.612512 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.617962 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hbwhm" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.621545 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: W1216 14:57:27.637816 4728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2df7f0f_2588_4958_90ec_db2d62025b7f.slice/crio-f01fd1dce17010cef542d0a4b3c50ba8887b29465ed44c5a7259c00c4b406d09 WatchSource:0}: Error finding container f01fd1dce17010cef542d0a4b3c50ba8887b29465ed44c5a7259c00c4b406d09: Status 404 returned error can't find the container with id f01fd1dce17010cef542d0a4b3c50ba8887b29465ed44c5a7259c00c4b406d09 Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.715604 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.715668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.715689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.715714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.715733 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.720865 4728 generic.go:334] "Generic (PLEG): container finished" podID="290f0e95-e5fa-4b56-acb0-babc0cf3c5d9" containerID="fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80" exitCode=0 Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.720978 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerDied","Data":"fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.724174 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hbwhm" event={"ID":"d2df7f0f-2588-4958-90ec-db2d62025b7f","Type":"ContainerStarted","Data":"f01fd1dce17010cef542d0a4b3c50ba8887b29465ed44c5a7259c00c4b406d09"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.745032 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.760921 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.789267 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.804010 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.819400 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.825003 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.825043 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.825055 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.825073 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.825085 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.844613 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.867877 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.885235 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.899027 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.914030 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.928573 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.928616 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.928632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.928659 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.928676 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:27Z","lastTransitionTime":"2025-12-16T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.930519 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.949068 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:27 crc kubenswrapper[4728]: I1216 14:57:27.976607 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:27Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.018342 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.031838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.031888 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.031899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.031918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.031932 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.054910 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.134838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.134894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.134905 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.134926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.134939 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.237978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.238060 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.238081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.238109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.238133 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.325021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.325082 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.325099 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.325122 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.325140 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.347316 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.352535 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.352600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.352618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.352643 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.352665 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.375191 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.381734 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.381796 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.381815 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.381841 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.381860 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.404280 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.410598 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.410673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.410700 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.410736 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.410760 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.434745 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.440870 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.440937 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.440953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.440978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.440995 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.464015 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.464236 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.466902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.466977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.466996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.467024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.467041 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.506436 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.506461 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.506700 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:28 crc kubenswrapper[4728]: E1216 14:57:28.506793 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.570997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.571514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.571561 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.571596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.571620 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.674831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.674905 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.674926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.674958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.674976 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.734511 4728 generic.go:334] "Generic (PLEG): container finished" podID="290f0e95-e5fa-4b56-acb0-babc0cf3c5d9" containerID="e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb" exitCode=0 Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.734933 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerDied","Data":"e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.736749 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hbwhm" event={"ID":"d2df7f0f-2588-4958-90ec-db2d62025b7f","Type":"ContainerStarted","Data":"36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.748841 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.758392 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.778968 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.779069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.779091 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.779124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.779147 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.780544 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: 
I1216 14:57:28.802036 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.819265 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.840786 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.857666 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.873624 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.881698 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.881758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.881775 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.881798 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.881813 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.889770 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.923548 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.939605 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/mul
tus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.959448 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.982713 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.984399 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.984511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.984532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.984556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.984572 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:28Z","lastTransitionTime":"2025-12-16T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:28 crc kubenswrapper[4728]: I1216 14:57:28.997596 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:28Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.010316 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.020432 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.033022 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.045591 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.064251 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.076538 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.087022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.087049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.087059 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.087074 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.087086 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.088102 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.099286 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.113281 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.126195 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.139649 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.151500 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.178476 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.190373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.190487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.190513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.190545 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.190568 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.196626 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.213580 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.230332 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.266885 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"da
ta-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464
553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.293063 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.293104 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.293113 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.293129 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.293140 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.329702 4728 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.396697 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.396754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.396768 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.396788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.396813 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.500456 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.500523 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.500540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.500565 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.500584 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.506153 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:29 crc kubenswrapper[4728]: E1216 14:57:29.506372 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.525853 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.542892 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.558958 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.576311 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.596871 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.602368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.602426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.602438 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.602455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.602466 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.616486 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.637344 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.654098 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.670657 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.694757 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.705543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.705584 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.705601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.705627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.705642 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.709087 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.737167 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.757376 4728 generic.go:334] "Generic (PLEG): container finished" podID="290f0e95-e5fa-4b56-acb0-babc0cf3c5d9" containerID="6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572" exitCode=0 Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.757454 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerDied","Data":"6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.793399 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z 
is after 2025-08-24T17:21:41Z"
Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.808239 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.808299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.808316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.808339 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.808358 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.823043 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-
cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.856726 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.901378 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.910860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.910894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.910906 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.910928 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.910945 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:29Z","lastTransitionTime":"2025-12-16T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.938637 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:29 crc kubenswrapper[4728]: I1216 14:57:29.978963 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.014188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.014230 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.014240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.014258 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.014269 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.015043 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.057005 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.110751 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.117151 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.117198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.117217 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.117242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.117260 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.142291 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.183718 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.217351 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.219986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.220225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.220459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.220719 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.220921 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.258056 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.298014 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.323296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.323831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.324071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.324212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.324349 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.343209 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df5
2df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.378789 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.417848 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.428064 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.428148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.428161 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.428186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.428204 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.453142 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.505895 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.505911 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:30 crc kubenswrapper[4728]: E1216 14:57:30.506154 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:30 crc kubenswrapper[4728]: E1216 14:57:30.506329 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.531736 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.531784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.531802 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.531829 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.531846 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.634948 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.635041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.635061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.635088 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.635108 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.738193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.738647 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.738821 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.738978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.739113 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.766579 4728 generic.go:334] "Generic (PLEG): container finished" podID="290f0e95-e5fa-4b56-acb0-babc0cf3c5d9" containerID="371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5" exitCode=0 Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.766658 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerDied","Data":"371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.792484 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.821796 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4
d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.843313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.843374 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.843605 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.843632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.843649 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.854040 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z 
is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.870896 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.886643 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.896479 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.911558 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.929238 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.942392 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.947238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.947304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.947323 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.947354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.947374 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:30Z","lastTransitionTime":"2025-12-16T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.961475 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:30 crc kubenswrapper[4728]: I1216 14:57:30.975210 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:30Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.004517 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.033232 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.051949 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.052038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.052055 4728 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.052080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.052096 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.073726 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.093831 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.154568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.154621 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.154636 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.154657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.154667 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.258080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.258130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.258140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.258159 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.258171 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.361392 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.361447 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.361459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.361477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.361490 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.464869 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.464931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.464943 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.464964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.464977 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.506488 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:31 crc kubenswrapper[4728]: E1216 14:57:31.506659 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.568187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.568264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.568284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.568314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.568335 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.572637 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.590479 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.607566 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.626106 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.644585 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.665992 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.671441 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.671509 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.671527 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.671553 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.671574 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.680256 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.695830 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.708641 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.739304 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f78846455328
2c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.757971 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.777336 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.777384 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.777449 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.777471 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.777488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
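The condition above is the kubelet's network-readiness gate: the runtime reports NetworkReady=false because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/, so the node is marked NotReady. The surrounding entries show the pods that populate that directory (multus and ovnkube-node) still starting, so at this point in the boot the condition is expected rather than a separate fault. A minimal sketch of the check, assuming the usual libcni file extensions (the exact set the runtime accepts is not shown in the log):

```go
// Sketch of the readiness gate described above. The directory path is taken
// verbatim from the kubelet message; the extension list is an assumption.
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path from the NodeNotReady message
	var found []string
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		// Glob only errors on malformed patterns; these are constant.
		matches, _ := filepath.Glob(filepath.Join(confDir, pattern))
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// This is the state the node is in: the runtime keeps reporting
		// NetworkReady=false until a config file shows up.
		fmt.Println("no CNI configuration file in", confDir)
		return
	}
	for _, f := range found {
		fmt.Println("found CNI config:", f)
	}
}
```

Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.781499 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" event={"ID":"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9","Type":"ContainerStarted","Data":"d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.783092 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 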
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.789518 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.789998 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.790054 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.804080 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.825031 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.825625 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.826846 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.848584 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\
\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.868659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325745326
5a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",
\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"h
ostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.880976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.881021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.881038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.881065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.881080 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
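Each err string above embeds the attempted patch as a quoted JSON document, which is why it is thick with escaped quotes: klog quotes the error message, and the message itself quotes the patch. The $setElementOrder/conditions key is a strategic-merge-patch directive giving the server the intended order of the conditions list. Treated as a Go string literal, one strconv.Unquote recovers readable JSON; the payload below is a hand-abridged fragment of the multus-bdpsg patch from these entries, not the full document:

```go
// Sketch: unquote and pretty-print one of the embedded status patches.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// Abridged fragment of the multus-bdpsg patch, as it appears (quoted)
	// inside the log's err string.
	quoted := `"{\"metadata\":{\"uid\":\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\"},\"status\":{\"phase\":\"Running\",\"podIP\":\"192.168.126.11\"}}"`

	raw, err := strconv.Unquote(quoted) // strip one level of Go quoting
	if err != nil {
		log.Fatal(err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(raw), "", "  "); err != nil {
		log.Fatal(err)
	}
	fmt.Println(pretty.String())
}
```

Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.886838 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126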
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.903656 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.926557 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.944094 4728 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.959064 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.970940 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.984215 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.984297 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.984320 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.984353 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.984378 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:31Z","lastTransitionTime":"2025-12-16T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:31 crc kubenswrapper[4728]: I1216 14:57:31.988528 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:31Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.007934 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.033176 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.060530 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.087965 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.088016 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.088031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.088054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.088069 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.101397 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.156030 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.184478 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.195556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.195638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.195657 4728 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.195685 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.195703 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.223588 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.262526 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:32Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.299117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.299167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.299190 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.299215 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.299235 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.402595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.402672 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.402691 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.402717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.402736 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.505994 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.506114 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:32 crc kubenswrapper[4728]: E1216 14:57:32.506308 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:32 crc kubenswrapper[4728]: E1216 14:57:32.506518 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.506866 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.506930 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.506955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.506988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.507011 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.610350 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.610470 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.610494 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.610529 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.610559 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.713613 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.713668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.713680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.713701 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.713716 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.793287 4728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.816400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.816494 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.816511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.816541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.816558 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.918916 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.918986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.919004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.919032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.919049 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:32Z","lastTransitionTime":"2025-12-16T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:32 crc kubenswrapper[4728]: I1216 14:57:32.980877 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.022368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.022441 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.022586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.022700 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.022753 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.125525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.125581 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.125592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.125623 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.125640 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.228994 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.229107 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.229127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.229229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.229248 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.334758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.334830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.334848 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.334875 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.334893 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.437904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.438003 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.438021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.438057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.438077 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.505610 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:33 crc kubenswrapper[4728]: E1216 14:57:33.505857 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.547253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.547311 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.547324 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.547346 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.547359 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.649728 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.649778 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.649790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.649809 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.649820 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.752549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.752629 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.752649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.752681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.752703 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.855695 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.855751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.855763 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.855783 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.855796 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.959020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.959102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.959130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.959156 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:33 crc kubenswrapper[4728]: I1216 14:57:33.959177 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:33Z","lastTransitionTime":"2025-12-16T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.062912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.062976 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.062993 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.063020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.063041 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.166871 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.166936 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.166953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.166980 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.166998 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.269935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.269988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.270001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.270021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.270035 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.289250 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.289484 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:57:50.289455067 +0000 UTC m=+51.129634071 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.373080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.373162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.373183 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.373214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.373235 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.390682 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.390728 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.390759 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.390787 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.390917 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.390924 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.390925 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391017 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391073 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.390936 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391091 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391097 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391026 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:50.391000557 +0000 UTC m=+51.231179551 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391154 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:50.391137571 +0000 UTC m=+51.231316555 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391172 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:50.391164972 +0000 UTC m=+51.231343946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.391183 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:50.391177722 +0000 UTC m=+51.231356696 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.476830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.476884 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.476900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.476924 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.476941 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.505642 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.505665 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.505854 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:34 crc kubenswrapper[4728]: E1216 14:57:34.505971 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.580953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.581247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.581418 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.581546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.581652 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.684931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.684995 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.685008 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.685026 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.685040 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.787761 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.787838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.787862 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.787891 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.787913 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.891172 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.891253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.891275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.891304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.891327 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.994868 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.994927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.994945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.994970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:34 crc kubenswrapper[4728]: I1216 14:57:34.994988 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:34Z","lastTransitionTime":"2025-12-16T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.099426 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.099484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.099498 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.099518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.099532 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.202109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.202199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.202217 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.202245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.202266 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.305953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.306007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.306019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.306038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.306051 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.409968 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.410019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.410028 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.410047 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.410058 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.506062 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:35 crc kubenswrapper[4728]: E1216 14:57:35.506267 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.512076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.512137 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.512158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.512184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.512203 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.616050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.616134 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.616160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.616192 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.616217 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.720004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.720076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.720097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.720127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.720146 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.808262 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/0.log" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.812699 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e" exitCode=1 Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.812759 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.813857 4728 scope.go:117] "RemoveContainer" containerID="e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.823110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.823177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.823199 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.823225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.823250 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.840694 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.864614 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.882255 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.910170 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.926351 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.926452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.926482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.926508 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.926527 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:35Z","lastTransitionTime":"2025-12-16T14:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.930107 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.953152 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.971237 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:35 crc kubenswrapper[4728]: I1216 14:57:35.987521 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:35Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.021541 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.030463 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.030878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.030893 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.030937 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.030952 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.050189 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.069162 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.091306 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.109041 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.128871 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b16
2f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T
14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.134007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.134053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.134067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.134088 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.134102 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.161013 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:34Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:33.628266 6043 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:57:33.629012 6043 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:33.629058 6043 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:33.629072 6043 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:33.629150 6043 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:33.629160 6043 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:33.629170 6043 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:33.629181 6043 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:33.629209 6043 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:33.629210 6043 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:33.629254 6043 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:33.629294 6043 factory.go:656] Stopping watch factory\\\\nI1216 14:57:33.629298 6043 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:33.629313 6043 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.236916 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.237000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.237020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.237051 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.237070 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.340162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.340227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.340246 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.340270 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.340288 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.443699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.443808 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.443826 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.443852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.443869 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.506159 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.506159 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:36 crc kubenswrapper[4728]: E1216 14:57:36.506356 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:36 crc kubenswrapper[4728]: E1216 14:57:36.506477 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.507232 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc"] Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.508020 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.511633 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.512049 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.546943 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.547257 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.547473 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.547641 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.547805 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.547959 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.570703 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.592878 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.613080 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.614473 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tcz6\" (UniqueName: \"kubernetes.io/projected/dcc79473-539a-47f0-ba53-932ad31f7422-kube-api-access-2tcz6\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.614524 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcc79473-539a-47f0-ba53-932ad31f7422-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.614572 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcc79473-539a-47f0-ba53-932ad31f7422-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.614614 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcc79473-539a-47f0-ba53-932ad31f7422-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.633239 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.650825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.650876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.650889 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.650908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.650922 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.658098 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:
57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.680120 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703
574f660136f6852435e8aa5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:34Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:33.628266 6043 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:57:33.629012 6043 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:33.629058 6043 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:33.629072 6043 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:33.629150 6043 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:33.629160 6043 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:33.629170 6043 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:33.629181 6043 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:33.629209 6043 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:33.629210 6043 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:33.629254 6043 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:33.629294 6043 factory.go:656] Stopping watch factory\\\\nI1216 14:57:33.629298 6043 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:33.629313 6043 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.696849 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.716110 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcc79473-539a-47f0-ba53-932ad31f7422-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.716195 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tcz6\" (UniqueName: \"kubernetes.io/projected/dcc79473-539a-47f0-ba53-932ad31f7422-kube-api-access-2tcz6\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.716246 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcc79473-539a-47f0-ba53-932ad31f7422-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.716303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcc79473-539a-47f0-ba53-932ad31f7422-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.717051 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcc79473-539a-47f0-ba53-932ad31f7422-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.717215 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcc79473-539a-47f0-ba53-932ad31f7422-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.717882 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.725184 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcc79473-539a-47f0-ba53-932ad31f7422-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.737373 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.738079 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tcz6\" (UniqueName: \"kubernetes.io/projected/dcc79473-539a-47f0-ba53-932ad31f7422-kube-api-access-2tcz6\") pod \"ovnkube-control-plane-749d76644c-2gtsc\" (UID: \"dcc79473-539a-47f0-ba53-932ad31f7422\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.753994 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.754251 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.754267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.754276 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.754291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.754300 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.771197 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.787955 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.801778 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.818437 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/0.log" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.818916 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.821636 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.822091 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.827607 4728 util.go:30] "No sandbox for pod can be found. 
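
[annotation] Every "Failed to update status for pod" entry in this stretch fails the same way: before admitting the kubelet's status patch, the API server calls the admission webhook pod.network-node-identity.openshift.io at https://127.0.0.1:9743/pod, and the TLS handshake is rejected because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-16. The sketch below is a diagnostic aid, not kubelet code: it reproduces the NotBefore/NotAfter comparison that Go's crypto/x509 verifier is reporting, assuming the webhook is still listening on 127.0.0.1:9743 as the URLs in these entries show.

```go
// certcheck.go - minimal sketch: dial the webhook endpoint without chain
// verification and report the leaf certificate's validity window, the same
// check behind "x509: certificate has expired or is not yet valid".
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Address taken from the webhook URL in the log entries above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspect the cert only; do not trust it
	})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", leaf.Subject)
	fmt.Printf("notBefore: %s\n", leaf.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", leaf.NotAfter.Format(time.RFC3339))
	if time.Now().After(leaf.NotAfter) {
		fmt.Printf("EXPIRED: current time %s is after %s\n",
			time.Now().UTC().Format(time.RFC3339),
			leaf.NotAfter.UTC().Format(time.RFC3339))
	}
}
```

Until the certificate is rotated (or a clock that jumped past NotAfter is corrected), every patch is rejected; the timestamps show the kubelet retrying the same patches within tens of milliseconds of each other.
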
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.832753 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.846669 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.857162 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.857222 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.857239 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.857262 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.857280 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.869031 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.885794 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.903186 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.953134 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.959753 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.959801 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.959830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.959852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.959868 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:36Z","lastTransitionTime":"2025-12-16T14:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
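
[annotation] Interleaved with the webhook failures, the kubelet keeps flipping the node to NotReady because the container runtime reports no CNI configuration. That readiness test is simple: the runtime network counts as ready only once a network config file appears in /etc/kubernetes/cni/net.d/. A stand-alone sketch of the check, assuming the conventional *.conf/*.conflist/*.json extensions that CRI-O's CNI watcher accepts (an assumption; this log only names the directory):

```go
// cnicheck.go - minimal sketch of the readiness test behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("NetworkReady=false: no CNI configuration file found")
		return
	}
	fmt.Printf("found CNI config(s): %v\n", confs)
}
```

The directory is evidently still empty here; the ovnkube-control-plane pod that is part of the network stack appears later in this log with both containers stuck in ContainerCreating.
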
Has your network provider started?"} Dec 16 14:57:36 crc kubenswrapper[4728]: I1216 14:57:36.985956 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:36Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.009971 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.029219 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.052660 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.063514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.063577 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.063591 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.063612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.063625 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.065134 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.079318 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\
\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.095263 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeover
ride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.113861 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/
etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:34Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:33.628266 6043 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:57:33.629012 6043 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:33.629058 6043 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:33.629072 6043 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:33.629150 6043 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:33.629160 6043 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:33.629170 6043 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:33.629181 6043 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:33.629209 6043 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:33.629210 6043 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:33.629254 6043 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:33.629294 6043 factory.go:656] Stopping watch factory\\\\nI1216 14:57:33.629298 6043 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:33.629313 6043 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.128555 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.143827 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.155832 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.166609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.166654 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.166662 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.166679 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.166689 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.269086 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.269127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.269136 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.269154 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.269164 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.371935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.371981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.371990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.372006 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.372016 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.474685 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.474780 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.474808 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.474846 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.474872 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.506564 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:37 crc kubenswrapper[4728]: E1216 14:57:37.506812 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.578066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.578120 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.578138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.578169 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.578188 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.681217 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.681255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.681266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.681284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.681295 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.784639 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.784727 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.784750 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.784781 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.784806 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.826858 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/1.log" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.827700 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/0.log" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.831373 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af" exitCode=1 Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.831452 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.831505 4728 scope.go:117] "RemoveContainer" containerID="e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.832803 4728 scope.go:117] "RemoveContainer" containerID="70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af" Dec 16 14:57:37 crc kubenswrapper[4728]: E1216 14:57:37.833189 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.833837 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" event={"ID":"dcc79473-539a-47f0-ba53-932ad31f7422","Type":"ContainerStarted","Data":"105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.833891 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" event={"ID":"dcc79473-539a-47f0-ba53-932ad31f7422","Type":"ContainerStarted","Data":"df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.833922 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" event={"ID":"dcc79473-539a-47f0-ba53-932ad31f7422","Type":"ContainerStarted","Data":"b18fe08eabe0acd30f96725e83ec78762d059299471c5a490f07d04f05ac4f8b"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.847374 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.868874 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-bin
ary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.887715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.887793 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.887811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.887838 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.887857 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.899017 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:34Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:33.628266 6043 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:57:33.629012 6043 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:33.629058 6043 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:33.629072 6043 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:33.629150 6043 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:33.629160 6043 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:33.629170 6043 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:33.629181 6043 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:33.629209 6043 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:33.629210 6043 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:33.629254 6043 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:33.629294 6043 factory.go:656] Stopping watch factory\\\\nI1216 14:57:33.629298 6043 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:33.629313 6043 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.914202 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.933161 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.948357 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.959876 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.976034 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8
113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.989375 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:37Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.990960 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.991011 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.991028 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.991053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:37 crc kubenswrapper[4728]: I1216 14:57:37.991071 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:37Z","lastTransitionTime":"2025-12-16T14:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.005217 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.021663 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.034981 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.058501 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f78846455328
2c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.066583 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kjxbh"] Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.067179 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.067256 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.077595 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.093672 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.093735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.093748 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.093768 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.093784 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.094309 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.112893 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.129272 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.131881 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.132033 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mzfb\" (UniqueName: \"kubernetes.io/projected/d13ff897-af48-416f-ba3f-44f7e4344a75-kube-api-access-4mzfb\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.150328 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9
ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57
:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.169485 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b7
26d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e83968662a4b748acd5e80230cb956c205758703574f660136f6852435e8aa5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:34Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:33.628266 6043 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:57:33.629012 6043 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:33.629058 6043 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:33.629072 6043 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:33.629150 6043 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:33.629160 6043 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:33.629170 6043 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:33.629181 6043 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:33.629209 6043 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:33.629210 6043 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:33.629254 6043 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:33.629294 6043 factory.go:656] Stopping watch factory\\\\nI1216 14:57:33.629298 6043 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:33.629313 6043 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch 
factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://
582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.183746 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 
14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.196652 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.196726 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.196744 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.196769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.196786 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.203941 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.215933 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.227233 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.233272 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mzfb\" (UniqueName: \"kubernetes.io/projected/d13ff897-af48-416f-ba3f-44f7e4344a75-kube-api-access-4mzfb\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.233350 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.233614 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.233745 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:38.733714445 +0000 UTC m=+39.573893639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.253791 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.266213 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mzfb\" (UniqueName: \"kubernetes.io/projected/d13ff897-af48-416f-ba3f-44f7e4344a75-kube-api-access-4mzfb\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.305505 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.306261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.306292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.306302 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.306319 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.306328 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.318236 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.331659 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.342669 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.353508 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" 
for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.374272 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.395302 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.410071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.410103 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.410114 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.410130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.410141 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.410363 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.424488 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.506283 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.506383 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.506504 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.506718 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.513493 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.513536 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.513568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.513591 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.513610 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.616860 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.616979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.616995 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.617019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.617035 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.719834 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.719902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.719920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.719944 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.719963 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.739575 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.739877 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.739978 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:39.739947056 +0000 UTC m=+40.580126080 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.787017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.787088 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.787105 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.787130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.787147 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.810883 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.816999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.817079 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.817104 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.817131 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.817150 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.840095 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/1.log" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.841323 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.845215 4728 scope.go:117] "RemoveContainer" containerID="70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.845691 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.847025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.847100 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.847125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.847157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.847182 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.868597 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.872665 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.873958 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.874033 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.874093 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.874121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.874141 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.894925 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.898253 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.900284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.900372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc 
kubenswrapper[4728]: I1216 14:57:38.900439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.900470 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.900491 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.920656 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: E1216 14:57:38.920886 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.923017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.923071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.923093 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.923120 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.923137 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:38Z","lastTransitionTime":"2025-12-16T14:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.932453 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b7
26d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.951342 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.971144 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:38 crc kubenswrapper[4728]: I1216 14:57:38.990063 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:38Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.005192 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.025513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.025570 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.025587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.025612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.025629 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.029844 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.048249 4728 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.068394 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.085279 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.103210 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.161372 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.161437 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.161451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.161469 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.161481 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.165053 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.187909 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.203039 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.225253 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.244602 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.264260 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.264333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.264353 4728 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.264382 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.264401 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.367646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.368128 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.368149 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.368179 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.368196 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.470923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.470990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.471007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.471031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.471050 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.506516 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.506582 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:39 crc kubenswrapper[4728]: E1216 14:57:39.506777 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:39 crc kubenswrapper[4728]: E1216 14:57:39.506982 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.528191 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.546956 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.564306 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.573642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.573689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.573704 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.573722 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.573735 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.585781 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.611792 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.646173 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.664003 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.678657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.678710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.678725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.678745 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.678759 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.681775 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.704489 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
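Annotation: every status patch in this stretch fails identically — the API server cannot call the pod.network-node-identity.openshift.io admission webhook on 127.0.0.1:9743 because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-12-16T14:57:39Z. The "certificate has expired or is not yet valid" text is Go's crypto/x509 validity-window rejection. A minimal sketch of the same check, against a hypothetical PEM path (the log does not say where the cert lives):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path, for illustration only.
	data, err := os.ReadFile("/tmp/webhook-serving.pem")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// crypto/x509 rejects a chain whose certificate is outside
	// [NotBefore, NotAfter]; that rejection produces the error text above.
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Println("valid until", cert.NotAfter.UTC().Format(time.RFC3339))
	}
}

Until that serving certificate is renewed (or the node clock corrected, if it is the clock that is wrong), every webhook-gated status patch in this log will keep failing the same way. The log resumes: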
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.724678 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
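Annotation: the terminated kube-apiserver-check-endpoints container above left a klog-formatted termination message; the final F1216 ... cmd.go:182] entry is fatal severity, which matches the recorded exit code 255. klog packs severity, date, time, thread id, and source location into a fixed header. A small sketch that unpacks one of the lines quoted above:

package main

import (
	"fmt"
	"regexp"
)

// klog header layout: Lmmdd hh:mm:ss.uuuuuu threadid file:line] msg
var hdr = regexp.MustCompile(`^([IWEF])(\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.]+:\d+)\] (.*)$`)

func main() {
	line := `F1216 14:57:18.340715       1 cmd.go:182] pods "kube-apiserver-crc" not found`
	m := hdr.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	sev := map[string]string{"I": "INFO", "W": "WARNING", "E": "ERROR", "F": "FATAL"}[m[1]]
	fmt.Printf("severity=%s month=%s day=%s time=%s pid=%s src=%s msg=%q\n",
		sev, m[2], m[3], m[4], m[5], m[6], m[7])
}

The log resumes: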
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.745747 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.763573 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.767203 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:39 crc kubenswrapper[4728]: E1216 14:57:39.767536 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:39 crc kubenswrapper[4728]: E1216 14:57:39.767653 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:41.767626242 +0000 UTC m=+42.607805266 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.781518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.781571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.781583 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.781600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.781612 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.787538 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.806898 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.829233 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.860145 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.878613 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:39Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.884397 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.884483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.884502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.884527 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.884546 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.987747 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.987781 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.987789 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.987803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:39 crc kubenswrapper[4728]: I1216 14:57:39.987813 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:39Z","lastTransitionTime":"2025-12-16T14:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.090571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.090645 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.090670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.090696 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.090713 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.193758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.194050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.194135 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.194240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.194366 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.297381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.297439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.297450 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.297495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.297510 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.401146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.401218 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.401241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.401274 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.401296 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.504646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.504695 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.504708 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.504727 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.504742 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.505508 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.505590 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:40 crc kubenswrapper[4728]: E1216 14:57:40.505668 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:40 crc kubenswrapper[4728]: E1216 14:57:40.505789 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.606939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.607012 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.607035 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.607065 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.607088 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.709900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.709966 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.709989 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.710019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.710041 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.813060 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.813121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.813138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.813164 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.813182 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.916259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.916321 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.916338 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.916363 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:40 crc kubenswrapper[4728]: I1216 14:57:40.916382 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:40Z","lastTransitionTime":"2025-12-16T14:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.019709 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.019771 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.019788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.019811 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.019828 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.123091 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.123168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.123193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.123221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.123244 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.226538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.226618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.226645 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.226677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.226700 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.330125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.330186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.330202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.330230 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.330247 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.434123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.434168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.434185 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.434207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.434224 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.506056 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.506324 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:41 crc kubenswrapper[4728]: E1216 14:57:41.506648 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:41 crc kubenswrapper[4728]: E1216 14:57:41.506877 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.536930 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.536998 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.537016 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.537045 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.537067 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.639999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.640053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.640070 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.640092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.640110 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.743708 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.743792 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.743812 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.743839 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.743860 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.793306 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:41 crc kubenswrapper[4728]: E1216 14:57:41.793517 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:41 crc kubenswrapper[4728]: E1216 14:57:41.793590 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:45.793566793 +0000 UTC m=+46.633745807 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.847202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.847263 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.847281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.847304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.847322 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.951015 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.951667 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.951898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.952071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:41 crc kubenswrapper[4728]: I1216 14:57:41.952220 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:41Z","lastTransitionTime":"2025-12-16T14:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.056377 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.056479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.056497 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.056524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.056541 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.160209 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.160271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.160288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.160313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.160333 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.263128 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.263180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.263201 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.263228 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.263249 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.366928 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.366997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.367020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.367047 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.367072 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.470194 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.470247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.470266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.470290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.470308 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.506200 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.506278 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:42 crc kubenswrapper[4728]: E1216 14:57:42.506489 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:42 crc kubenswrapper[4728]: E1216 14:57:42.506634 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.574074 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.574160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.574183 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.574213 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.574238 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.677041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.677123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.677150 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.677180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.677202 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.779112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.779161 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.779172 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.779189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.779200 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.881532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.881569 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.881580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.881597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.881608 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.984495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.984805 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.985051 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.985281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:42 crc kubenswrapper[4728]: I1216 14:57:42.985534 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:42Z","lastTransitionTime":"2025-12-16T14:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.088083 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.088143 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.088168 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.088196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.088216 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:43Z","lastTransitionTime":"2025-12-16T14:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.191455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.191520 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.191538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.191557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.191568 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:43Z","lastTransitionTime":"2025-12-16T14:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.295177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.295239 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.295255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.295315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.295333 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:43Z","lastTransitionTime":"2025-12-16T14:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.505634 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:57:43 crc kubenswrapper[4728]: I1216 14:57:43.505725 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:57:43 crc kubenswrapper[4728]: E1216 14:57:43.505806 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:57:43 crc kubenswrapper[4728]: E1216 14:57:43.505968 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:57:44 crc kubenswrapper[4728]: I1216 14:57:44.505972 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:57:44 crc kubenswrapper[4728]: E1216 14:57:44.506140 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:57:44 crc kubenswrapper[4728]: I1216 14:57:44.506549 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:57:44 crc kubenswrapper[4728]: E1216 14:57:44.506844 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:57:45 crc kubenswrapper[4728]: I1216 14:57:45.505745 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:57:45 crc kubenswrapper[4728]: E1216 14:57:45.505899 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:57:45 crc kubenswrapper[4728]: I1216 14:57:45.505755 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:57:45 crc kubenswrapper[4728]: E1216 14:57:45.506169 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:57:45 crc kubenswrapper[4728]: I1216 14:57:45.839087 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:57:45 crc kubenswrapper[4728]: E1216 14:57:45.839498 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 14:57:45 crc kubenswrapper[4728]: E1216 14:57:45.839667 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:57:53.839627608 +0000 UTC m=+54.679806642 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 16 14:57:46 crc kubenswrapper[4728]: I1216 14:57:46.505760 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:57:46 crc kubenswrapper[4728]: I1216 14:57:46.505760 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:57:46 crc kubenswrapper[4728]: E1216 14:57:46.505957 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:57:46 crc kubenswrapper[4728]: E1216 14:57:46.506100 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.505629 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.505686 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:57:47 crc kubenswrapper[4728]: E1216 14:57:47.505895 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:57:47 crc kubenswrapper[4728]: E1216 14:57:47.506149 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Has your network provider started?"} Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.740518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.740876 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.741018 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.741157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.741340 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:47Z","lastTransitionTime":"2025-12-16T14:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.850247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.850563 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.850752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.850917 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.851068 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:47Z","lastTransitionTime":"2025-12-16T14:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.954305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.954383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.954430 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.954455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:47 crc kubenswrapper[4728]: I1216 14:57:47.954476 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:47Z","lastTransitionTime":"2025-12-16T14:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.057563 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.057649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.057660 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.057681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.057699 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.160666 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.160724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.160741 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.160766 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.160783 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.264636 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.264942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.265119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.265271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.265443 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.368965 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.369043 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.369071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.369101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.369124 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.471899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.471956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.471974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.471996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.472012 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.505951 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.505980 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:48 crc kubenswrapper[4728]: E1216 14:57:48.506118 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:48 crc kubenswrapper[4728]: E1216 14:57:48.506247 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.574804 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.574863 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.574880 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.574904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.574922 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.677978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.678020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.678030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.678046 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.678057 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.781196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.781254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.781271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.781294 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.781311 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.884393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.884483 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.884501 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.884525 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.884543 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.987477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.987560 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.987577 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.987602 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:48 crc kubenswrapper[4728]: I1216 14:57:48.987619 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:48Z","lastTransitionTime":"2025-12-16T14:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.090044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.090086 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.090096 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.090112 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.090123 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.192256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.192338 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.192359 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.192383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.192400 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.266632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.267019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.267169 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.267312 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.267497 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.290191 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.295922 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.295977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.295995 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.296017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.296034 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.316072 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.321032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.321101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
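[note: unlike the CNI messages, these status-patch failures are not network-plugin related — the kubelet's PATCH of the Node object is rejected because the "node.network-node-identity.openshift.io" validating webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24, long before the node's clock time of 2025-12-16, which is consistent with a CRC VM started well after its embedded certificates were issued. A quick way to confirm from the node is to pull the webhook's serving certificate and print its validity window — sketched here with Python's stdlib plus the third-party cryptography package; the host/port are taken from the error above, and the script is an illustration, not part of the cluster tooling:

import socket, ssl
from datetime import datetime, timezone
from cryptography import x509   # third-party: pip install cryptography

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the error above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False      # we only want to read the certificate,
ctx.verify_mode = ssl.CERT_NONE # not to trust it

with socket.create_connection((HOST, PORT), timeout=5) as raw:
    with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER, returned even unverified

cert = x509.load_der_x509_certificate(der)
not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)
print("subject :", cert.subject.rfc4514_string())
print("notAfter:", not_after.isoformat())
print("expired :", not_after < datetime.now(timezone.utc))

if the diagnosis above is right, this would print a notAfter of 2025-08-24T17:21:41+00:00 and expired True, matching the x509 message in the log verbatim.]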
event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.321122 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.321153 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.321172 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.341846 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.346855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.346927 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.346951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.346978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.346999 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.367082 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
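[The retries above never reach the webhook's handler: the kubelet's HTTPS client aborts the TLS handshake because the current time is past the serving certificate's NotAfter. A minimal, self-contained Go sketch of that validity-window check follows; the file name webhook-cert.pem is a hypothetical placeholder, not a path taken from this log.]

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
        "time"
    )

    func main() {
        // Load a PEM-encoded certificate; "webhook-cert.pem" is illustrative only.
        data, err := os.ReadFile("webhook-cert.pem")
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            log.Fatal("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        now := time.Now().UTC()
        switch {
        case now.After(cert.NotAfter):
            // The condition behind "x509: certificate has expired or is not
            // yet valid: current time ... is after ...".
            fmt.Printf("expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("not yet valid: current time %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
        default:
            fmt.Println("certificate is within its validity window")
        }
    }

[Until the certificate served on 127.0.0.1:9743 is rotated, every status patch in this log fails the same way, which is why the identical payload recurs in the retries below.]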
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.372093 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.372161 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
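[The recurring "Node became not ready" condition in the surrounding records has a separate cause from the webhook failure: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration yet. A hedged Go sketch of an equivalent probe follows; it only mirrors the observable condition, and the assumed config extensions (.conf, .conflist, .json) are the conventional CNI ones, not something stated in this log.]

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // hasCNIConfig reports whether dir contains at least one CNI config file,
    // mirroring the condition behind "no CNI configuration file in
    // /etc/kubernetes/cni/net.d/".
    func hasCNIConfig(dir string) (bool, error) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, err
        }
        for _, e := range entries {
            if e.IsDir() {
                continue
            }
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
        if err != nil || !ok {
            fmt.Println("NetworkReady=false reason:NetworkPluginNotReady (no CNI configuration file)")
            return
        }
        fmt.Println("NetworkReady=true")
    }

[Once the network provider writes a config file into that directory, the same probe would flip to NetworkReady=true and the NodeNotReady events should stop.]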
event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.372180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.372207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.372226 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.392513 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.392756 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.394874 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.394949 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.394974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.395002 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.395022 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.497496 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.497600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.497629 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.497661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.497686 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.505493 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.505551 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.505728 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:49 crc kubenswrapper[4728]: E1216 14:57:49.505924 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.523881 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.540675 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.570287 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.591447 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.599905 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.600278 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.600503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.600774 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.600945 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.613340 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.635982 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.668113 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.684662 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.704269 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.704364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.704460 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.704487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.704521 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.704545 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.723334 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.738988 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 
14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.761261 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.788114 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/o
s-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.806926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.806980 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.806997 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.807022 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.807072 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.815792 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b7
26d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.841038 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.860297 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.876571 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:49Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.909160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.909328 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.909348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.909373 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:49 crc kubenswrapper[4728]: I1216 14:57:49.909447 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:49Z","lastTransitionTime":"2025-12-16T14:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.012903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.012984 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.013006 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.013037 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.013059 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.116012 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.116076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.116093 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.116117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.116135 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.218628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.218688 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.218710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.218741 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.218764 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.290501 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.290720 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:58:22.290685632 +0000 UTC m=+83.130864656 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.321823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.321887 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.321910 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.321942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.321964 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.392248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.392331 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.392393 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.392461 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392608 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392617 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392682 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392694 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392711 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392629 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392782 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392803 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392684 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:58:22.392659944 +0000 UTC m=+83.232838968 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392865 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:58:22.392830588 +0000 UTC m=+83.233009692 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392898 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:58:22.39288151 +0000 UTC m=+83.233060644 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.392947 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:58:22.392931241 +0000 UTC m=+83.233110385 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.424595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.424638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.424670 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.424696 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.424710 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.505695 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.505704 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.505932 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:50 crc kubenswrapper[4728]: E1216 14:57:50.506048 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.527587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.527641 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.527656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.527678 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.527692 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.631280 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.631341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.631356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.631378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.631396 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.735153 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.735219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.735237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.735262 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.735280 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.845244 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.845318 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.845341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.845369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.845391 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.948296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.948387 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.948451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.948477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:50 crc kubenswrapper[4728]: I1216 14:57:50.948498 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:50Z","lastTransitionTime":"2025-12-16T14:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.051370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.051467 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.051492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.051522 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.051543 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.154865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.154921 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.154943 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.154971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.154990 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.258216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.258288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.258313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.258342 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.258364 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.361308 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.361784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.361967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.362157 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.362331 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.465652 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.466350 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.466396 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.466511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.466533 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.505377 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.505567 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:51 crc kubenswrapper[4728]: E1216 14:57:51.505807 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:51 crc kubenswrapper[4728]: E1216 14:57:51.505977 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.569322 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.569383 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.569400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.569453 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.569471 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.672304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.672354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.672370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.672393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.672439 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.775451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.775516 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.775532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.775571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.775587 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.878543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.878603 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.878621 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.878646 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.878666 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.981720 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.981788 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.981810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.981836 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:51 crc kubenswrapper[4728]: I1216 14:57:51.981856 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:51Z","lastTransitionTime":"2025-12-16T14:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.012042 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.028183 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.036483 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.055362 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.071394 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.085195 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.085233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.085250 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.085271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.085287 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.089679 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.111956 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.139108 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.160950 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.179283 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.188170 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.188214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.188262 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.188286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.188337 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.205886 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.222867 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.240048 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839
cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.258382 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b7
26d925c0ba0edf3bd563b5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.276189 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.291980 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.292205 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.292477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.292671 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.292851 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.297796 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.311123 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.324996 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.364662 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:52Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.396502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.396549 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.396566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.396592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.396611 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.499653 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.499716 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.499733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.499759 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.499776 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.505947 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.505999 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:52 crc kubenswrapper[4728]: E1216 14:57:52.506095 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:52 crc kubenswrapper[4728]: E1216 14:57:52.506215 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.603303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.603369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.603385 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.603436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.603455 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.706567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.706633 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.706656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.706685 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.706707 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.809515 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.809585 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.809612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.809645 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.809669 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.911865 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.911908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.911918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.911939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:52 crc kubenswrapper[4728]: I1216 14:57:52.911950 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:52Z","lastTransitionTime":"2025-12-16T14:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.015204 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.015262 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.015284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.015314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.015337 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.118324 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.118431 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.118449 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.118477 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.118495 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.221463 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.221526 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.221542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.221571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.221594 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.324781 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.324851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.324868 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.324892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.324941 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.428454 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.428524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.428548 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.428577 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.428601 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.506508 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.506664 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:53 crc kubenswrapper[4728]: E1216 14:57:53.506873 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:53 crc kubenswrapper[4728]: E1216 14:57:53.507672 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.507927 4728 scope.go:117] "RemoveContainer" containerID="70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.531068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.531140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.531165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.531193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.531214 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.634991 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.635484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.635502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.635527 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.635545 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.739451 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.739519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.739547 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.739578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.739606 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.842892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.842956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.842974 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.843000 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.843018 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.900059 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/1.log" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.904470 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.905420 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.934909 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:53 crc kubenswrapper[4728]: E1216 14:57:53.935112 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:53 crc kubenswrapper[4728]: E1216 14:57:53.935200 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:58:09.935176811 +0000 UTC m=+70.775355835 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.937946 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:53Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.946352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.946400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.946442 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.946474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.946493 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:53Z","lastTransitionTime":"2025-12-16T14:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.965621 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:53Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:53 crc kubenswrapper[4728]: I1216 14:57:53.990762 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:53Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.019143 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.030336 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.041340 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.048460 4728 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.048518 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.048554 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.048575 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.048590 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.060351 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.075318 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.088850 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.103747 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.118922 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.130397 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.145274 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.150619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.150740 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.150818 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.150908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.150992 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.163761 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee
89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.177300 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 
14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.192764 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.208905 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.227627 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.253340 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.253379 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.253425 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.253438 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.253447 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.355342 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.355617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.355713 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.355820 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.356004 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.460160 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.460476 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.460586 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.460688 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.460779 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.506005 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:54 crc kubenswrapper[4728]: E1216 14:57:54.506182 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.506438 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:54 crc kubenswrapper[4728]: E1216 14:57:54.506661 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.562990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.563050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.563067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.563089 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.563107 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.666027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.666111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.666137 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.666224 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.666261 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.769510 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.769564 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.769582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.769608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.769625 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.872259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.872309 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.872327 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.872349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.872366 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.911700 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/2.log" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.912947 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/1.log" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.920625 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458" exitCode=1 Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.920686 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.920730 4728 scope.go:117] "RemoveContainer" containerID="70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.921974 4728 scope.go:117] "RemoveContainer" containerID="bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458" Dec 16 14:57:54 crc kubenswrapper[4728]: E1216 14:57:54.922456 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.946650 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.969668 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.975376 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.975463 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.975481 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.975503 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.975521 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:54Z","lastTransitionTime":"2025-12-16T14:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:54 crc kubenswrapper[4728]: I1216 14:57:54.988905 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:54Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.008380 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.019670 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.035079 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" 
for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.066630 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.078608 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.078678 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.078704 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.078738 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.078764 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.087067 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.105507 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.124671 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.142957 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.162573 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.182188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.182243 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.182259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.182285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.182302 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.186781 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:
57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.216653 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee
89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70075057682f872800027cfe00497b5bf81336b726d925c0ba0edf3bd563b5af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:57:37.472335 6188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1216 14:57:37.472436 6188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1216 14:57:37.472449 6188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1216 14:57:37.472522 6188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:57:37.472549 6188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1216 14:57:37.472565 6188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1216 14:57:37.472613 6188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1216 14:57:37.472618 6188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1216 14:57:37.472628 6188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1216 14:57:37.472637 6188 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:57:37.472649 6188 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:57:37.472649 6188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1216 14:57:37.472672 6188 factory.go:656] Stopping watch factory\\\\nI1216 14:57:37.472677 6188 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:57:37.472690 6188 ovnkube.go:599] Stopped ovnkube\\\\nI1216 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 
14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o:
//582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.234446 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 
14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.259575 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.277267 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.285598 4728 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.285661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.285683 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.285711 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.285732 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.292973 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.388691 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.388738 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.388756 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.388777 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.388793 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.491863 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.492175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.492315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.492478 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.492598 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.506219 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.506236 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:55 crc kubenswrapper[4728]: E1216 14:57:55.506378 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:55 crc kubenswrapper[4728]: E1216 14:57:55.506516 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.595319 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.595378 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.595400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.595466 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.595488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.698627 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.698680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.698699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.698722 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.698739 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.802142 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.802180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.802189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.802219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.802227 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.905175 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.905238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.905254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.905277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.905295 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:55Z","lastTransitionTime":"2025-12-16T14:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.927362 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/2.log" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.932444 4728 scope.go:117] "RemoveContainer" containerID="bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458" Dec 16 14:57:55 crc kubenswrapper[4728]: E1216 14:57:55.932690 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.949618 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.971690 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:55 crc kubenswrapper[4728]: I1216 14:57:55.994736 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-16T14:57:55Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.008699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.008777 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.008800 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.008831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.008852 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.028589 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee
89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.048726 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.069942 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.081968 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.092032 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.106849 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.111849 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.112004 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.112024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.112109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.112135 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.120635 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.138018 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.151524 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.165551 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.179309 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.212500 4728 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"cont
ainerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.214735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.214897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.215020 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.215174 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.215296 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.229519 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.243686 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.262597 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:56Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.317361 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.317655 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.317740 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.317835 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.317920 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.420964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.421023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.421041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.421064 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.421083 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.506043 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.506090 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:56 crc kubenswrapper[4728]: E1216 14:57:56.506203 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:56 crc kubenswrapper[4728]: E1216 14:57:56.506376 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.523988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.524034 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.524050 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.524071 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.524087 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.627271 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.627319 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.627331 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.627349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.627360 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.729784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.729849 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.729866 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.729889 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.729908 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.832837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.832902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.832918 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.832944 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.832962 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.935853 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.935907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.935928 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.935956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:56 crc kubenswrapper[4728]: I1216 14:57:56.935980 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:56Z","lastTransitionTime":"2025-12-16T14:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.040143 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.040211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.040232 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.040259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.040286 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.143233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.143315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.143338 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.143366 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.143388 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.245845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.245901 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.245923 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.245952 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.245973 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.348181 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.348245 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.348261 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.348282 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.348298 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.451452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.451505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.451522 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.451544 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.451561 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.505386 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.505428 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:57 crc kubenswrapper[4728]: E1216 14:57:57.505593 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:57 crc kubenswrapper[4728]: E1216 14:57:57.505712 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.554041 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.554085 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.554102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.554123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.554141 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.657484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.657552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.657568 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.657593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.657611 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.759645 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.759708 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.759727 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.759752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.759770 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.862391 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.862476 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.862495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.862519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.862537 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.965146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.965215 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.965233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.965258 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:57 crc kubenswrapper[4728]: I1216 14:57:57.965275 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:57Z","lastTransitionTime":"2025-12-16T14:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.067959 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.068023 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.068042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.068067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.068084 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.171001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.171058 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.171077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.171103 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.171119 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.274029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.274083 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.274101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.274124 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.274140 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.376836 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.376902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.376920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.376945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.376969 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.480127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.480206 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.480229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.480262 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.480282 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.505792 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.505902 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:57:58 crc kubenswrapper[4728]: E1216 14:57:58.506043 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:57:58 crc kubenswrapper[4728]: E1216 14:57:58.506153 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.583456 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.583580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.583607 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.583632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.583649 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.686202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.686268 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.686290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.686318 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.686343 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.790364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.790461 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.790481 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.790505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.790521 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.895283 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.895351 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.895368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.895392 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.895431 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.997909 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.997953 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.997963 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.997979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:58 crc kubenswrapper[4728]: I1216 14:57:58.997990 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:58Z","lastTransitionTime":"2025-12-16T14:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.100902 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.100940 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.100948 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.100962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.100972 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.203752 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.203817 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.203840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.203863 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.203880 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.306856 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.306904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.306920 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.306942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.306963 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.410270 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.410346 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.410371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.410400 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.410461 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.506099 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.506243 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.506338 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.506465 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.513247 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.513363 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.513393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.513495 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.513572 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.524792 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.542921 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.562143 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.580676 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.596293 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.614644 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.618007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.618068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.618087 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.618110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.618128 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.630586 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.651560 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.676302 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.693605 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.709629 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.713885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.713983 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.714054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.714121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.714181 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.728035 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.731984 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.732013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.732024 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.732042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.732056 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.739943 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.748549 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.760891 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.761197 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.761392 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.761488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.761576 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.761678 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.779378 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.782548 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.783736 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.783793 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc 
kubenswrapper[4728]: I1216 14:57:59.783810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.783832 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.783847 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.802950 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.810068 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.811740 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.811803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.811820 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.811845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.811862 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.822768 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.831620 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: E1216 14:57:59.831765 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.833582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.833648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.833666 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.833692 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.833710 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.837872 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.855184 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:57:59Z is after 2025-08-24T17:21:41Z" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.936999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.937533 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.938030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.938300 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:57:59 crc kubenswrapper[4728]: I1216 14:57:59.938539 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:57:59Z","lastTransitionTime":"2025-12-16T14:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.041821 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.042167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.042184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.042211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.042229 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.145681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.145749 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.145765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.145790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.145806 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.248971 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.249057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.249081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.249113 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.249137 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.351987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.352099 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.352123 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.352150 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.352174 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.454552 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.454620 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.454643 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.454668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.454686 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.505301 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:00 crc kubenswrapper[4728]: E1216 14:58:00.505508 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.505626 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:00 crc kubenswrapper[4728]: E1216 14:58:00.505784 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.557333 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.557376 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.557492 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.557514 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.557549 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.660611 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.660664 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.660684 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.660708 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:00 crc kubenswrapper[4728]: I1216 14:58:00.660725 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:00Z","lastTransitionTime":"2025-12-16T14:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 14:58:01 crc kubenswrapper[4728]: I1216 14:58:01.506372 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:58:01 crc kubenswrapper[4728]: E1216 14:58:01.506491 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:58:01 crc kubenswrapper[4728]: I1216 14:58:01.506531 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:58:01 crc kubenswrapper[4728]: E1216 14:58:01.506722 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:58:02 crc kubenswrapper[4728]: I1216 14:58:02.506112 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:58:02 crc kubenswrapper[4728]: I1216 14:58:02.506138 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:58:02 crc kubenswrapper[4728]: E1216 14:58:02.506238 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:58:02 crc kubenswrapper[4728]: E1216 14:58:02.506342 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.505881 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.505957 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:58:03 crc kubenswrapper[4728]: E1216 14:58:03.506027 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:58:03 crc kubenswrapper[4728]: E1216 14:58:03.506123 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.958313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.958355 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.958368 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.958385 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:58:03 crc kubenswrapper[4728]: I1216 14:58:03.958397 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:03Z","lastTransitionTime":"2025-12-16T14:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.087950 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.087986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.087996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.088009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.088017 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.189897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.189942 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.189955 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.189972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.189984 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.292084 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.292116 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.292126 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.292139 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.292150 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.394077 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.394127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.394135 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.394150 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.394160 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.497445 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.497502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.497519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.497541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.497556 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.505732 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.506053 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:04 crc kubenswrapper[4728]: E1216 14:58:04.506237 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:04 crc kubenswrapper[4728]: E1216 14:58:04.506644 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.599760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.599810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.599828 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.599852 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.599871 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.702054 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.702098 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.702119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.702143 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.702160 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.805336 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.805432 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.805454 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.805479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.805496 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.908240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.908886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.908993 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.909090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:04 crc kubenswrapper[4728]: I1216 14:58:04.909178 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:04Z","lastTransitionTime":"2025-12-16T14:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.011730 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.011782 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.011795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.011813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.011825 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.114840 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.115095 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.115203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.115294 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.115393 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.218256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.218314 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.218328 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.218350 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.218364 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.320246 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.320292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.320303 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.320317 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.320327 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.422609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.422705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.422718 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.422738 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.422749 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.505967 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.506002 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:05 crc kubenswrapper[4728]: E1216 14:58:05.506128 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:05 crc kubenswrapper[4728]: E1216 14:58:05.506228 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.525202 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.525242 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.525253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.525266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.525280 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.628807 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.628894 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.628919 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.628957 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.628975 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.731689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.731737 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.731751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.731773 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.731788 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.834597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.834671 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.834689 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.834718 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.834740 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.937758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.937879 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.937904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.937932 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:05 crc kubenswrapper[4728]: I1216 14:58:05.937954 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:05Z","lastTransitionTime":"2025-12-16T14:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.040296 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.040343 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.040354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.040370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.040381 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.142635 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.142720 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.142754 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.142784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.142805 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.245641 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.245706 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.245717 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.245733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.245744 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.348893 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.348933 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.348945 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.348962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.348973 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.451681 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.451747 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.451764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.451790 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.451839 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.506190 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:06 crc kubenswrapper[4728]: E1216 14:58:06.506339 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.506211 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:06 crc kubenswrapper[4728]: E1216 14:58:06.506658 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.554634 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.554677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.554697 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.554724 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.554746 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.656988 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.657056 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.657076 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.657101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.657117 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.759038 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.759102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.759206 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.759233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.759261 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.861855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.861892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.861901 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.861915 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.861944 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.964132 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.964167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.964176 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.964188 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:06 crc kubenswrapper[4728]: I1216 14:58:06.964196 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:06Z","lastTransitionTime":"2025-12-16T14:58:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.067072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.067110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.067119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.067133 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.067144 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.169301 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.169341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.169349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.169363 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.169374 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.271699 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.271735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.271743 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.271756 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.271765 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.374439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.374482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.374490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.374505 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.374515 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.477102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.477163 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.477180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.477207 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.477230 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.506212 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.506258 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:07 crc kubenswrapper[4728]: E1216 14:58:07.506540 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:07 crc kubenswrapper[4728]: E1216 14:58:07.506383 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.579177 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.579211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.579222 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.579235 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.579243 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.682190 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.682229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.682237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.682252 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.682260 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.784758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.784912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.784931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.784954 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.784970 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.887402 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.887452 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.887459 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.887476 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.887488 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.989102 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.989125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.989133 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.989145 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:07 crc kubenswrapper[4728]: I1216 14:58:07.989154 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:07Z","lastTransitionTime":"2025-12-16T14:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.091298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.091344 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.091355 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.091371 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.091382 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.193515 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.193541 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.193550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.193562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.193571 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.295892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.295936 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.295947 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.295963 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.295973 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.398609 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.398674 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.398687 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.398703 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.398715 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.501488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.501524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.501531 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.501545 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.501554 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.505912 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.505927 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:08 crc kubenswrapper[4728]: E1216 14:58:08.506028 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:08 crc kubenswrapper[4728]: E1216 14:58:08.506163 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.603875 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.603979 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.604003 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.604039 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.604063 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.707027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.707068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.707080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.707096 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.707109 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.814573 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.814612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.814624 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.814775 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.814791 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.918219 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.918256 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.918266 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.918280 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:08 crc kubenswrapper[4728]: I1216 14:58:08.918294 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:08Z","lastTransitionTime":"2025-12-16T14:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.020519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.020600 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.020623 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.020651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.020669 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.123057 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.123111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.123121 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.123138 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.123149 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.226027 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.226061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.226072 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.226087 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.226096 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.328705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.328762 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.328782 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.328813 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.328837 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.431484 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.431519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.431530 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.431546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.431557 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.505644 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.505736 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:09 crc kubenswrapper[4728]: E1216 14:58:09.505825 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:09 crc kubenswrapper[4728]: E1216 14:58:09.506445 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.506957 4728 scope.go:117] "RemoveContainer" containerID="bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458" Dec 16 14:58:09 crc kubenswrapper[4728]: E1216 14:58:09.513673 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.533221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.533267 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.533277 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.533291 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.533301 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.539029 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.553491 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.568164 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.584845 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.600866 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.611566 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.624898 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.636007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.636067 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.636084 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.636110 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.636127 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
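
Every "Failed to update status for pod" record above fails for one shared reason: the serving certificate of the pod.network-node-identity.openshift.io webhook expired on 2025-08-24T17:21:41Z, so the kubelet's TLS client rejects the connection at 2025-12-16T14:58:09Z before any status patch can be delivered. The error string is produced by Go's crypto/x509 package. The following self-contained Go sketch (illustrative only; the key, subject, and dates are invented to match the timestamps in the log, nothing is read from the cluster) reproduces the same failure by verifying an expired self-signed certificate at the logged wall-clock time:

package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"log"
	"math/big"
	"time"
)

func main() {
	// Hypothetical self-signed serving cert whose NotAfter matches the
	// expiry stamped in the log entries (2025-08-24T17:21:41Z).
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		log.Fatal(err)
	}
	tmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "network-node-identity.example"},
		NotBefore:             time.Date(2024, 8, 24, 17, 21, 41, 0, time.UTC),
		NotAfter:              time.Date(2025, 8, 24, 17, 21, 41, 0, time.UTC),
		IsCA:                  true,
		BasicConstraintsValid: true,
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		log.Fatal(err)
	}
	cert, err := x509.ParseCertificate(der)
	if err != nil {
		log.Fatal(err)
	}

	roots := x509.NewCertPool()
	roots.AddCert(cert)

	// Verify at the wall-clock time the kubelet logged (14:58:09Z).
	_, err = cert.Verify(x509.VerifyOptions{
		Roots:       roots,
		CurrentTime: time.Date(2025, 12, 16, 14, 58, 9, 0, time.UTC),
	})
	fmt.Println(err)
	// Prints the same class of error seen above:
	// x509: certificate has expired or is not yet valid:
	// current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z
}

Until that certificate is rotated (or the node clock corrected, if it had jumped ahead), every write gated by this admission webhook will fail the same way, which is why the identical error repeats for every pod below.
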
Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.640683 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.653514 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.662530 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.674676 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.685105 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.699034 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.708117 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.718869 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.738639 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.738828 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.738939 4728 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.739044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.739153 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.746028 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.760337 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.777516 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:09Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.841915 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.841960 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.841972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.841987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.841996 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.949109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.949153 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.949166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.949184 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.949197 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:09Z","lastTransitionTime":"2025-12-16T14:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
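
The recurring NodeNotReady / "Node became not ready" records have a second, independent cause: the kubelet's container-runtime network check finds no CNI configuration in /etc/kubernetes/cni/net.d/ (the OVN-Kubernetes pods that would write it cannot come up while the webhook above is broken). A minimal Go sketch of that readiness check, under the assumptions that the conf dir is the one named in the log and that, as with libcni, any *.conf, *.conflist, or *.json file counts; this approximates the kubelet's probe rather than quoting its code:

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// confDir mirrors the path reported in the log entries above.
const confDir = "/etc/kubernetes/cni/net.d"

func main() {
	entries, err := os.ReadDir(confDir)
	if err != nil && !os.IsNotExist(err) {
		fmt.Fprintln(os.Stderr, "read error:", err)
		os.Exit(1)
	}
	var confs []string
	for _, e := range entries {
		// Accept the extensions libcni recognizes for network configs.
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		// The condition the kubelet surfaces as NetworkReady=false /
		// NetworkPluginNotReady in the records above.
		fmt.Printf("no CNI configuration file in %s. Has your network provider started?\n", confDir)
		return
	}
	fmt.Println("CNI configs found:", confs)
}

Once the network provider writes a config file into that directory, the Ready condition flips back on the next sync; until then the kubelet keeps re-recording the same five node events each status loop, which is the repetition visible throughout this section.
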
Has your network provider started?"} Dec 16 14:58:09 crc kubenswrapper[4728]: I1216 14:58:09.952022 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:09 crc kubenswrapper[4728]: E1216 14:58:09.952124 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:58:09 crc kubenswrapper[4728]: E1216 14:58:09.952184 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:58:41.952167782 +0000 UTC m=+102.792346766 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.051903 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.051967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.051984 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.052007 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.052023 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.063149 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.063270 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.063356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.063449 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.063529 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.078174 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.081013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.081042 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.081053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.081068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.081078 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.091168 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.094561 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.094588 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
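Each of these failed attempts sends the same strategic merge patch quoted in full at 14:58:10.078174: a `$setElementOrder/conditions` directive fixing condition ordering, plus the changed conditions, the allocatable/capacity figures, and the node's image inventory. A minimal sketch of how such a payload is assembled, using the field names visible in the log; the struct and helper are illustrative, not the kubelet's actual types:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// condition mirrors the shape of the entries under "conditions" in the
// logged patch; field names match the log, everything else is a sketch.
type condition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
}

func main() {
	now := time.Now().UTC().Format(time.RFC3339)
	patch := map[string]any{
		"status": map[string]any{
			// Strategic-merge-patch directive: keeps the condition list
			// in this order when the patch is applied server-side.
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"}, {"type": "DiskPressure"},
				{"type": "PIDPressure"}, {"type": "Ready"},
			},
			"conditions": []condition{{
				Type:   "Ready",
				Status: "False",
				Reason: "KubeletNotReady",
				Message: "container runtime network not ready: NetworkReady=false " +
					"reason:NetworkPluginNotReady message:Network plugin returns error: " +
					"no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?",
				LastHeartbeatTime:  now,
				LastTransitionTime: now,
			}},
		},
	}
	out, err := json.MarshalIndent(patch, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```

Note that the payload itself is well-formed; every attempt dies earlier, at the TLS handshake with the validating webhook, before patch semantics matter at all.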
event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.094596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.094606 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.094615 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.104881 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.108019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.108146 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
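Every retry fails the same way: the network-node-identity webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-16. One way to confirm this offline is to parse the serving certificate and compare its validity window against the current time, which is exactly the check crypto/x509 performs during verification. The certificate path below is a placeholder, not a path taken from this log:

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// Offline check for the failure mode in these retries:
// "x509: certificate has expired or is not yet valid".
func main() {
	raw, err := os.ReadFile("/path/to/webhook-serving-cert.pem") // hypothetical path
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now()
	fmt.Printf("NotBefore=%s NotAfter=%s\n", cert.NotBefore, cert.NotAfter)
	switch {
	case now.Before(cert.NotBefore):
		fmt.Println("certificate is not yet valid")
	case now.After(cert.NotAfter):
		fmt.Println("certificate has expired") // matches the kubelet error above
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```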
event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.108223 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.108316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.108423 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.119562 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.122943 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.123097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
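The same check can be run live against the endpoint the kubelet is calling (https://127.0.0.1:9743). Completing the handshake with verification disabled makes the expired leaf certificate inspectable despite the verification failure; a diagnostic sketch to run on the node itself, not a fix:

```go
package main

import (
	"crypto/tls"
	"fmt"
)

// Live probe of the webhook endpoint from these retries. Verification
// is skipped on purpose so the handshake completes and the expired
// certificate can still be read; never ship this setting in real clients.
func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // we want the cert even though verification fails
	})
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notAfter=%s\n", cert.Subject, cert.NotAfter)
	}
}
```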
event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.123183 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.123270 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.123371 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.140612 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:10Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.141050 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.154563 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.154630 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.154649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.154673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.154690 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.257365 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.257443 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.257461 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.257488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.257508 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.360031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.360083 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.360096 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.360111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.360122 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.463137 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.463169 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.463182 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.463198 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.463210 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.506007 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.506050 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.506163 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:10 crc kubenswrapper[4728]: E1216 14:58:10.506259 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.567445 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.567521 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.567533 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.567550 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.567566 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.670580 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.670660 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.670679 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.670707 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.670724 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.773117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.773350 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.773436 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.773510 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.773585 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.876287 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.876636 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.876778 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.877001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.877181 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.978387 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/0.log" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.978458 4728 generic.go:334] "Generic (PLEG): container finished" podID="57f7e48b-7353-469c-ab9d-7f966c08d5f1" containerID="25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30" exitCode=1 Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.978518 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerDied","Data":"25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.979532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.979566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.979577 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.979592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.979603 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:10Z","lastTransitionTime":"2025-12-16T14:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:10 crc kubenswrapper[4728]: I1216 14:58:10.980051 4728 scope.go:117] "RemoveContainer" containerID="25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.004021 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.016647 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.025273 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.039585 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.052445 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.064143 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.077856 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.081919 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.081962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.081973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.081989 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.082000 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.092743 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.108208 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.131701 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.146555 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.163718 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.178800 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.183898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.183962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.183981 4728 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.184005 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.184024 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.194193 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.206585 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"2025-12-16T14:57:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d\\\\n2025-12-16T14:57:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d to /host/opt/cni/bin/\\\\n2025-12-16T14:57:25Z [verbose] multus-daemon started\\\\n2025-12-16T14:57:25Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:58:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.225480 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011
937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.246394 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.257915 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 
14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.287515 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.287626 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.287649 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.287673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.287688 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.390597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.390642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.390653 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.390668 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.390679 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.493011 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.493069 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.493085 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.493108 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.493125 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.506307 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.506327 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:11 crc kubenswrapper[4728]: E1216 14:58:11.506600 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:11 crc kubenswrapper[4728]: E1216 14:58:11.506743 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.524342 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.595705 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.595740 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.595747 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.595760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.595768 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.698292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.698319 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.698327 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.698341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.698350 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.803286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.803338 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.803369 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.803395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.803436 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.906061 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.906099 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.906107 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.906120 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.906129 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:11Z","lastTransitionTime":"2025-12-16T14:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.982992 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/0.log" Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.983074 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerStarted","Data":"1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076"} Dec 16 14:58:11 crc kubenswrapper[4728]: I1216 14:58:11.995500 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:11Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.007085 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.008990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.009167 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.009286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.009494 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.009648 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.021358 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529fdb48-e1b7-4a69-a177-19750ae97b6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.034395 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.045521 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.060430 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.071872 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.082453 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.101331 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.112206 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.112463 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.112487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.112496 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.112509 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.112519 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.124212 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.138071 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.168856 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.183925 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.199287 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.215284 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.215331 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc 
kubenswrapper[4728]: I1216 14:58:12.215348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.215374 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.215397 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.227642 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee
89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.243770 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.260490 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.273527 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"2025-12-16T14:57:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d\\\\n2025-12-16T14:57:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d to /host/opt/cni/bin/\\\\n2025-12-16T14:57:25Z [verbose] multus-daemon started\\\\n2025-12-16T14:57:25Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:58:10Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:12Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.317435 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.317461 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.317470 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.317482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.317490 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.419439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.419473 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.419486 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.419520 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.419531 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.505598 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:12 crc kubenswrapper[4728]: E1216 14:58:12.505722 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.505610 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:12 crc kubenswrapper[4728]: E1216 14:58:12.505902 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.521285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.521335 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.521354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.521377 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.521397 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.624479 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.624507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.624519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.624532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.624541 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.727289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.727352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.727370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.727393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 16 14:58:12 crc kubenswrapper[4728]: I1216 14:58:12.727438 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:12Z","lastTransitionTime":"2025-12-16T14:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the five entries above repeat as a block roughly every 100 ms, with only the timestamps changing: 14:58:12.829, 14:58:12.931, 14:58:13.033, 14:58:13.137, 14:58:13.241, 14:58:13.343, 14:58:13.446 ...]
Dec 16 14:58:13 crc kubenswrapper[4728]: I1216 14:58:13.510516 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:58:13 crc kubenswrapper[4728]: E1216 14:58:13.510685 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:58:13 crc kubenswrapper[4728]: I1216 14:58:13.510956 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:58:13 crc kubenswrapper[4728]: E1216 14:58:13.511068 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
[... node-status block repeats at 14:58:13.549, 14:58:13.653 ...]
[... node-status block repeats at 14:58:13.756, 14:58:13.859, 14:58:13.962, 14:58:14.064, 14:58:14.167, 14:58:14.270, 14:58:14.373, 14:58:14.477 ...]
Dec 16 14:58:14 crc kubenswrapper[4728]: I1216 14:58:14.506023 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:58:14 crc kubenswrapper[4728]: E1216 14:58:14.506165 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:58:14 crc kubenswrapper[4728]: I1216 14:58:14.506238 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:58:14 crc kubenswrapper[4728]: E1216 14:58:14.506318 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... node-status block repeats at 14:58:14.580, 14:58:14.683 ...]
[... node-status block repeats at 14:58:14.787, 14:58:14.890, 14:58:14.998, 14:58:15.101, 14:58:15.205, 14:58:15.308, 14:58:15.410 ...]
Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.505856 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.505897 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:58:15 crc kubenswrapper[4728]: E1216 14:58:15.506083 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:58:15 crc kubenswrapper[4728]: E1216 14:58:15.506184 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
[... node-status block repeats at 14:58:15.513, 14:58:15.617 ...]
Has your network provider started?"} Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.720250 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.720297 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.720315 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.720338 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.720354 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:15Z","lastTransitionTime":"2025-12-16T14:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.823548 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.823603 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.823619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.823643 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.823660 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:15Z","lastTransitionTime":"2025-12-16T14:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.926117 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.926189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.926212 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.926240 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:15 crc kubenswrapper[4728]: I1216 14:58:15.926262 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:15Z","lastTransitionTime":"2025-12-16T14:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.029626 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.029711 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.029735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.029764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.029788 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.132878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.132956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.132973 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.132998 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.133015 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.240506 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.240569 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.240592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.240623 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.240646 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.344114 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.344193 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.344216 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.344253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.344278 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.447636 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.447693 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.447711 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.447739 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.447760 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.505853 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.505890 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:16 crc kubenswrapper[4728]: E1216 14:58:16.506083 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:16 crc kubenswrapper[4728]: E1216 14:58:16.506209 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.551543 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.551612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.551626 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.551648 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.551664 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.655615 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.655757 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.655823 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.655853 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.655874 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.759794 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.759855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.759873 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.759898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.759914 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.862696 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.862744 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.862760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.862784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.862801 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.965764 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.965830 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.965847 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.965871 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:16 crc kubenswrapper[4728]: I1216 14:58:16.965887 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:16Z","lastTransitionTime":"2025-12-16T14:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.069127 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.069185 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.069204 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.069226 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.069243 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.171926 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.172080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.172101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.172126 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.172144 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.275283 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.275349 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.275367 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.275392 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.275441 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.378912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.378990 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.379012 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.379049 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.379071 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.482350 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.482458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.482482 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.482507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.482530 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.506048 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:17 crc kubenswrapper[4728]: E1216 14:58:17.506226 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.506364 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:17 crc kubenswrapper[4728]: E1216 14:58:17.506680 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.585851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.585925 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.585947 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.585972 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.585991 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.688793 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.688843 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.688867 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.688890 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.688906 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.791941 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.792013 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.792036 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.792068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.792090 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.895547 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.895601 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.895617 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.895642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.895665 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.999592 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.999655 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.999673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.999697 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:17 crc kubenswrapper[4728]: I1216 14:58:17.999713 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:17Z","lastTransitionTime":"2025-12-16T14:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.102826 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.102878 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.102898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.102925 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.102945 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.205444 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.205532 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.205557 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.205587 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.205610 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.308225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.308270 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.308279 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.308292 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.308300 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
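Every sync failure above hangs off the same root cause quoted in the message itself: nothing has yet written a CNI config into /etc/kubernetes/cni/net.d/, so sandbox creation for any pod that needs pod networking is blocked. A quick way to confirm what the kubelet sees, sketched in the same Python as above and run on the node itself (the directory path comes from the log; the expectation of a *.conf/*.conflist file is an assumption about the network plugin, apparently OVN-Kubernetes here judging by the ovnkube-identity volumes later in this journal):

    import os

    # Path quoted verbatim in the NetworkPluginNotReady messages above.
    CNI_DIR = "/etc/kubernetes/cni/net.d/"

    try:
        entries = sorted(os.listdir(CNI_DIR))
    except FileNotFoundError:
        entries = None

    if not entries:
        # Missing or empty: exactly the state the kubelet is complaining about.
        print(f"{CNI_DIR} is missing or empty -> NetworkReady stays False")
    else:
        for name in entries:
            path = os.path.join(CNI_DIR, name)
            print(f"{path}  ({os.path.getsize(path)} bytes)")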
Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.410872 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.410939 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.410956 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.410981 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.410997 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.505960 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.505994 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:18 crc kubenswrapper[4728]: E1216 14:58:18.506237 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:18 crc kubenswrapper[4728]: E1216 14:58:18.506343 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.513946 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.514111 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.514130 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.514153 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.514192 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.616661 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.616720 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.616735 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.616758 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.616775 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.719638 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.720313 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.720354 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.720377 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.720392 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.823455 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.823500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.823513 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.823530 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.823546 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.926273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.926320 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.926334 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.926352 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:18 crc kubenswrapper[4728]: I1216 14:58:18.926366 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:18Z","lastTransitionTime":"2025-12-16T14:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.028757 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.028818 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.028831 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.028851 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.028864 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.131611 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.131665 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.131677 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.131696 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.131710 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.244538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.244574 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.244582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.244595 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.244604 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.347844 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.347907 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.347924 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.347949 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.347971 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.451546 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.451619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.451644 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.451679 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.451702 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.505570 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.505654 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:19 crc kubenswrapper[4728]: E1216 14:58:19.505752 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:19 crc kubenswrapper[4728]: E1216 14:58:19.505866 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.524855 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.546336 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.558912 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.558964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.558982 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.559009 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.559026 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.564452 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.581007 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.600680 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.625561 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.646274 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.661125 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.661186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.661203 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.661228 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.661244 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.663553 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.694348 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.712978 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.735571 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.758811 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.764439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.764500 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.764519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.764542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.764559 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.773038 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.785445 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.802752 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"2025-12-16T14:57:25+00:00 [cnibincopy] Successfully copied 
files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d\\\\n2025-12-16T14:57:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d to /host/opt/cni/bin/\\\\n2025-12-16T14:57:25Z [verbose] multus-daemon started\\\\n2025-12-16T14:57:25Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:58:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.813369 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.824156 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.837602 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529fdb48-e1b7-4a69-a177-19750ae97b6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.851270 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:19Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.866092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.866148 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.866166 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.866189 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:19 crc kubenswrapper[4728]: I1216 14:58:19.866206 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:19Z","lastTransitionTime":"2025-12-16T14:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.300817 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:20Z is after 2025-08-24T17:21:41Z"
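The node-status patch hits the same expired certificate via the node.network-node-identity.openshift.io webhook on the same endpoint, so the kubelet logs "Error updating node status, will retry" and the subsequent retries fail identically. One way to confirm the expiry independently of the kubelet is to handshake with the endpoint with verification disabled and read the presented leaf certificate's NotAfter; a sketch under that assumption (only the 127.0.0.1:9743 address comes from this log):

```go
// Sketch: fetch the certificate that 127.0.0.1:9743 actually presents,
// skipping verification so the handshake succeeds even though the
// certificate is expired, then report its NotAfter.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // diagnostic only: we want the cert even if it is expired
	})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// The server always sends its leaf certificate first.
	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject=%s notAfter=%s\n", leaf.Subject, leaf.NotAfter)
	if time.Now().After(leaf.NotAfter) {
		fmt.Printf("expired %s ago\n", time.Since(leaf.NotAfter).Round(time.Second))
	}
}
```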
event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.306883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.306914 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.306934 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.326909 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.331892 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.331964 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.331982 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.332008 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.332026 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.351363 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.356631 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.356692 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.356710 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.356733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.356751 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.371641 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.376498 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.376539 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.376558 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.376582 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.376599 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.403332 4728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0442088b-ca61-48c2-99d7-338f049fa924\\\",\\\"systemUUID\\\":\\\"6cdaa06a-6501-425c-95d7-724f6caa86b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:20Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.403673 4728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.406238 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.406298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.406316 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.406343 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.406364 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.505665 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.505723 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.505862 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:20 crc kubenswrapper[4728]: E1216 14:58:20.506503 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.506994 4728 scope.go:117] "RemoveContainer" containerID="bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.508948 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.508986 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.509003 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.509029 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.509045 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.612145 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.612391 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.612502 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.612599 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.612681 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.715476 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.715524 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.715539 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.715562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.715578 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.818227 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.818255 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.818264 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.818281 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.818291 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.920619 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.920650 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.920657 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.920669 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:20 crc kubenswrapper[4728]: I1216 14:58:20.920677 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:20Z","lastTransitionTime":"2025-12-16T14:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.016722 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/2.log" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.019621 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.020267 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.023223 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.023259 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.023273 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.023289 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.023300 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.037220 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.054503 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.067485 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.082448 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.101010 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a301b009e58a0d15f541826658667de3fa0fa59
e3d9d7201257d834e5f12e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.115735 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 
14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.125651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.125673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.125680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.125708 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.125717 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.132531 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.147747 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"2025-12-16T14:57:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d\\\\n2025-12-16T14:57:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d to /host/opt/cni/bin/\\\\n2025-12-16T14:57:25Z [verbose] multus-daemon started\\\\n2025-12-16T14:57:25Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:58:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.170861 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.184021 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.198644 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529fdb48-e1b7-4a69-a177-19750ae97b6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.217147 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.227845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.228002 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.228081 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.228163 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.228241 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.231033 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.247309 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.261256 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.275885 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.293269 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.307155 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.317793 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:21Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.330970 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.331026 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.331044 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.331068 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.331085 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.433900 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.433975 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.433999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.434030 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.434051 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.506156 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.506190 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:21 crc kubenswrapper[4728]: E1216 14:58:21.506393 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:21 crc kubenswrapper[4728]: E1216 14:58:21.506600 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.536517 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.536571 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.536593 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.536618 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.536640 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.639458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.639772 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.639898 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.640053 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.640183 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.743662 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.743744 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.743769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.743799 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.743822 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.847032 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.847097 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.847115 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.847139 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.847158 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.950540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.950632 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.950652 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.950675 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:21 crc kubenswrapper[4728]: I1216 14:58:21.950692 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:21Z","lastTransitionTime":"2025-12-16T14:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.026523 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/3.log" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.028333 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/2.log" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.033043 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" exitCode=1 Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.033119 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.033914 4728 scope.go:117] "RemoveContainer" containerID="bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.034181 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.034527 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.053904 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.053962 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.053985 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.054017 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.054041 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.054713 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529fdb48-e1b7-4a69-a177-19750ae97b6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.094817 4728 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.117633 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.140586 4728 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.156686 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.156733 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.156746 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.156765 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.156779 4728 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.157526 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.171762 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.185060 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.197590 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.209344 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.221826 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.254748 4728 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b530948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"cont
ainerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.259751 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.259803 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.259817 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.259837 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.259850 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.271441 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.291151 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.311971 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.325764 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.344521 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"2025-12-16T14:57:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d\\\\n2025-12-16T14:57:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d to /host/opt/cni/bin/\\\\n2025-12-16T14:57:25Z [verbose] multus-daemon started\\\\n2025-12-16T14:57:25Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:58:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.363037 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.363092 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.363101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.363118 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.363129 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.370485 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.385587 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.385738 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.385715075 +0000 UTC m=+147.225894059 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.392475 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a301b009e58a0d15f541826658667de3fa0fa59
e3d9d7201257d834e5f12e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd54d0abdf5ea0b38c47a974d38bde5cc0e623ee89cb83daec7d35f42cc08458\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:57:54Z\\\",\\\"message\\\":\\\"resses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.253],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1216 14:57:54.404005 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404010 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1216 14:57:54.404015 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1216 14:57:54.404022 6389 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1216 14:57:54.404028 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1216 14:57:54.404029 6389 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nI1216 14:57:54.403887 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-hbwhm\\\\nI1216 14:57:54.404045 6389 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:21Z\\\",\\\"message\\\":\\\"ice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 14:58:21.336973 6804 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:58:21.336987 6804 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:58:21.336887 6804 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:58:21.336993 6804 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:58:21.336966 6804 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:58:21.337176 6804 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:58:21.337195 6804 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:58:21.337205 6804 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:58:21.337213 6804 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:58:21.337226 6804 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 14:58:21.337621 6804 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 14:58:21.337952 6804 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c3
6a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.410050 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:22Z is after 2025-08-24T17:21:41Z" Dec 16 
14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.466230 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.466286 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.466304 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.466330 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.466347 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.486244 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.486297 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.486329 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.486361 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486472 4728 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486549 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486569 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.486543896 +0000 UTC m=+147.326722920 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486570 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486592 4728 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486472 4728 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486644 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.486627738 +0000 UTC m=+147.326806812 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486693 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.486664769 +0000 UTC m=+147.326843833 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486807 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486833 4728 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486851 4728 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.486922 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.486901076 +0000 UTC m=+147.327080160 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.505556 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.505615 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.505720 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:22 crc kubenswrapper[4728]: E1216 14:58:22.505813 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.568725 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.568763 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.568774 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.568789 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.568800 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.671722 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.671760 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.671769 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.671784 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.671793 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.774311 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.774384 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.774458 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.774493 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.774516 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.877429 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.877487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.877504 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.877522 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.877534 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.980977 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.981047 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.981063 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.981089 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:22 crc kubenswrapper[4728]: I1216 14:58:22.981106 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:22Z","lastTransitionTime":"2025-12-16T14:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.045400 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/3.log" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.051772 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 14:58:23 crc kubenswrapper[4728]: E1216 14:58:23.052250 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.069658 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.084886 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.084975 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.084994 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.085019 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.085037 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.088957 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.106055 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hbwhm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2df7f0f-2588-4958-90ec-db2d62025b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36bd7817a7b1e28d27f6f7699af670192871c32c1ad47dcdae807b98527b7892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xfx6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hbwhm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.124195 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"529fdb48-e1b7-4a69-a177-19750ae97b6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.145619 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d07f523dc726446a33edbfa8bdd4a33c09f5d04a158e8f0bc8e6161aa91feba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.160932 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d1435b2a17009c042c713284ceb0a87a590565874e1fd14de9bc4177b6be1d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.179886 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.188826 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.188908 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.188935 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.188967 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.188992 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.197187 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6lqf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e13f8ca5-bf05-4740-be7d-81af5e57172b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd8b4dd08e64ba923c96f9cd0b9b64f5d30a46f044287b6d15de6d81e595a3d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbk27\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6lqf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.214602 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d13ff897-af48-416f-ba3f-44f7e4344a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mzfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kjxbh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.237774 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008cd71a-a642-43c5-8aa2-98db283d9c45\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"le observer\\\\nW1216 14:57:17.728756 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1216 14:57:17.729179 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1216 14:57:17.831292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2670081420/tls.crt::/tmp/serving-cert-2670081420/tls.key\\\\\\\"\\\\nI1216 14:57:18.329860 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1216 14:57:18.331924 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1216 14:57:18.331972 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1216 14:57:18.332022 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1216 14:57:18.332048 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1216 14:57:18.337447 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1216 14:57:18.337467 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337473 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1216 14:57:18.337478 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1216 14:57:18.337481 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1216 14:57:18.337484 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1216 14:57:18.337488 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1216 14:57:18.337530 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1216 14:57:18.340715 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.257852 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"160a926c-b24e-4068-8298-8c875c2027b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3a7006a35ad0c042328221dbce21e247c65b64f13c4e9fee2ae9fa05426fe6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63eb61a60d958fad06f355dfad5fe9aa562e053c3d08cfd530145387b5ae8127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94092fc3fda6c56d5f2575ab53407531e8291b6265fb3f483de23bf58dfcfd7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.278147 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.292031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.292109 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.292134 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.292165 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.292185 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.299279 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a09e8098618cba74ca8b48a5e518901cb30ad9aa2c186cc5c2476cc768e3a61b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9449cafecba3b3878802345b34e9ee80683f7fa75397a9dd4f2934e470728bba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.334611 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c852f82-3d2a-461a-9129-1fe6636e22e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0393b67e789fb4f396ab65e319fac02db494dbb0d4de738ca72e4be1ac27dc95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://428d37177248a08d508c0e43d245ca51ec799844c837892baa82ebcef5146f3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02ccc230198207ac607a19b9e6fa8ad3e87fce3a27f2f09949a08e5946a93002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://382cf96c41b41817ef08498f383d844cc5e6b53
0948b5e5f2cbcfc45b11daece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c1826a89e90578ead56343de089d042979889aae3a842e579e4d5931ac8622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3614d8862d903a88a873b906aacda89f2c76e4276701541ddeed0d8234d1ff20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400f788464553282c80230c144e9475bdf2200545456f4d47412989f476125d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1065ce939ae62a5ede2a52341e2ef02a54999e11e19a2110bc3adaef9e3bd95c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.358734 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bdpsg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57f7e48b-7353-469c-ab9d-7f966c08d5f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:10Z\\\",\\\"message\\\":\\\"2025-12-16T14:57:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d\\\\n2025-12-16T14:57:25+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_6d2fc6df-c0a3-4b4e-baf3-f40b64753a8d to /host/opt/cni/bin/\\\\n2025-12-16T14:57:25Z [verbose] multus-daemon started\\\\n2025-12-16T14:57:25Z [verbose] Readiness Indicator file check\\\\n2025-12-16T14:58:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8f7f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bdpsg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.381605 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"290f0e95-e5fa-4b56-acb0-babc0cf3c5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5a1ff2403634be0766ce57ce90eee729b40c87ba8d65ac8800f062d16287421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67bd33d6d3caf01522c34faa592ca0845c69ebbc3cfeb110839cddad2f05981\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c57277c734c93f476c39fc7e6fa75488cc68347c588d5b39fb131b6c11ea03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fbd4de1ef325c099aaf6c0f4d748fd913262471564564c45abaa003085c85b80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e011937b47242177b21099fffb9dd4529d85d8e9c7b4d67fe5cd7d2e5e6826fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b698d52e695cb0af2a220499043c0c236295f76bf177eac05e9660f8102a572\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371af83a1ae114b8ad2a8b716ac2ac0066440251c546aa78b99519db203bccb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvx6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9nv7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.397628 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.397696 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc 
kubenswrapper[4728]: I1216 14:58:23.397715 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.397743 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.397762 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.415634 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"480f8c1b-60cc-4685-86cc-a457f645e87c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a301b009e58a0d15f541826658667de3fa0fa59
e3d9d7201257d834e5f12e8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-16T14:58:21Z\\\",\\\"message\\\":\\\"ice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 14:58:21.336973 6804 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1216 14:58:21.336987 6804 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1216 14:58:21.336887 6804 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1216 14:58:21.336993 6804 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1216 14:58:21.336966 6804 handler.go:208] Removed *v1.Node event handler 7\\\\nI1216 14:58:21.337176 6804 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1216 14:58:21.337195 6804 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1216 14:58:21.337205 6804 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1216 14:58:21.337213 6804 handler.go:208] Removed *v1.Node event handler 2\\\\nI1216 14:58:21.337226 6804 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 14:58:21.337621 6804 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1216 14:58:21.337952 6804 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-16T14:58:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jwj7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2458v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.434251 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc79473-539a-47f0-ba53-932ad31f7422\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df992c7fc31b89f42ebcb89c0cc3cef4b7364c79387ab1ed019ab0dd71c866db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://105141ee8cc33994fd51505555ef655219e606e3547ebc98363e2790347c5018\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tcz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2gtsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.453003 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10211d5c-e094-410e-9995-070bc6d4b926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3592b2aef00e2ca759e170811d1387c35889058be030c9ef961e200bd6fcbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6984da2c6a3ef93c8ba013f2688b2ef55b72c2d7d2cbe32d5b8aa5347cfe307e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f65fbdd8e0d40e631941df1a43a5a3779d2bba6ea9464374e021578d2bcb8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b5d1241f4b7fd7bbe24e40666aa8bf083180ee9506121efc9e3d297cb7c7775\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:23Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.501140 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.501211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.501229 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.501254 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.501280 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.505463 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.505462 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:23 crc kubenswrapper[4728]: E1216 14:58:23.505645 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:23 crc kubenswrapper[4728]: E1216 14:58:23.505820 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.604336 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.604398 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.604462 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.604487 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.604503 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.707901 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.707965 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.707987 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.708511 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:23 crc kubenswrapper[4728]: I1216 14:58:23.708539 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:23Z","lastTransitionTime":"2025-12-16T14:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.431298 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.431364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.431382 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.431434 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.431452 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:24Z","lastTransitionTime":"2025-12-16T14:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.505487 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.505591 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:24 crc kubenswrapper[4728]: E1216 14:58:24.505700 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:24 crc kubenswrapper[4728]: E1216 14:58:24.506312 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
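
Every NotReady heartbeat in this stretch carries the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. On OpenShift the kubelet watches that directory rather than the default /etc/cni/net.d, and the cluster network provider (typically OVN-Kubernetes on this platform) is expected to write a conflist there once its own pods come up; until a file appears, NetworkReady stays false and every pod that still needs a sandbox is skipped. A minimal sketch of the check implied by the message, assuming nothing beyond the path in the log:

// cnicheck.go - sketch: list the CNI network configs the kubelet would see.
// The directory is the one named in the NetworkPluginNotReady message;
// one valid .conf/.conflist/.json file is enough to unblock the node.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // from the kubelet error message

	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files - node will stay NotReady")
		return
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("candidate CNI config:", e.Name())
		}
	}
}
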
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.534519 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.534567 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.534596 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.534614 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.534624 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:24Z","lastTransitionTime":"2025-12-16T14:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.638058 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.638131 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.638158 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.638186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:24 crc kubenswrapper[4728]: I1216 14:58:24.638205 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:24Z","lastTransitionTime":"2025-12-16T14:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.358225 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.358283 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.358299 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.358353 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.358372 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:25Z","lastTransitionTime":"2025-12-16T14:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.461490 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.461540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.461556 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.461578 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.461594 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:25Z","lastTransitionTime":"2025-12-16T14:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.506287 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:25 crc kubenswrapper[4728]: E1216 14:58:25.506518 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.506578 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:25 crc kubenswrapper[4728]: E1216 14:58:25.506730 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.564753 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.564821 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.564845 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.564872 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.564889 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:25Z","lastTransitionTime":"2025-12-16T14:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.666897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.666931 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.666938 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.666951 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:25 crc kubenswrapper[4728]: I1216 14:58:25.666959 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:25Z","lastTransitionTime":"2025-12-16T14:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.388232 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.388290 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.388341 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.388370 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.388393 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:26Z","lastTransitionTime":"2025-12-16T14:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.491836 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.491910 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.491932 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.491960 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.491979 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:26Z","lastTransitionTime":"2025-12-16T14:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.505294 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.505673 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:26 crc kubenswrapper[4728]: E1216 14:58:26.505778 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:26 crc kubenswrapper[4728]: E1216 14:58:26.506004 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.594883 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.594983 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.594999 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.595025 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.595043 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:26Z","lastTransitionTime":"2025-12-16T14:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.697566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.697642 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.697656 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.697680 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:26 crc kubenswrapper[4728]: I1216 14:58:26.697697 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:26Z","lastTransitionTime":"2025-12-16T14:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.418714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.418792 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.418810 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.418897 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.418914 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:27Z","lastTransitionTime":"2025-12-16T14:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.506103 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.506216 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:27 crc kubenswrapper[4728]: E1216 14:58:27.506305 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:27 crc kubenswrapper[4728]: E1216 14:58:27.506452 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
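
The condition payload in each "Node became not ready" entry is an ordinary corev1.NodeCondition serialized inline, which makes a saved journal easy to mine for transition timelines. A self-contained sketch, decoding one payload copied verbatim from the entries above:

// condition.go - sketch: decode the condition JSON embedded in the
// "Node became not ready" log entries. Field names mirror the
// corev1.NodeCondition type the kubelet serializes.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"time"
)

type nodeCondition struct {
	Type               string    `json:"type"`
	Status             string    `json:"status"`
	LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
	LastTransitionTime time.Time `json:"lastTransitionTime"`
	Reason             string    `json:"reason"`
	Message            string    `json:"message"`
}

func main() {
	// Condition payload copied verbatim from one of the entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:27Z","lastTransitionTime":"2025-12-16T14:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s since %s: %s\n", c.Type, c.Status,
		c.LastTransitionTime.Format(time.RFC3339), c.Reason)
}
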
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.522119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.522186 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.522214 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.522241 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.522262 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:27Z","lastTransitionTime":"2025-12-16T14:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.625919 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.625978 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.625996 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.626021 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:27 crc kubenswrapper[4728]: I1216 14:58:27.626039 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:27Z","lastTransitionTime":"2025-12-16T14:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.350095 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.350180 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.350205 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.350237 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.350259 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.453098 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.453200 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.453221 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.453246 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.453263 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.505992 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.506001 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:28 crc kubenswrapper[4728]: E1216 14:58:28.506163 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:28 crc kubenswrapper[4728]: E1216 14:58:28.506274 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.556001 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.556080 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.556101 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.556133 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.556157 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.659393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.659515 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.659538 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.659570 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.659594 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.762322 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.762381 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.762395 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.762439 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.762454 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.865612 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.865673 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.865690 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.865714 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.865731 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.969178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.969234 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.969253 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.969275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:28 crc kubenswrapper[4728]: I1216 14:58:28.969292 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:28Z","lastTransitionTime":"2025-12-16T14:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.071356 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.071474 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.071510 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.071542 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.071564 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.174139 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.174194 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.174211 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.174233 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.174249 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.278031 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.278531 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.278749 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.278985 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.279268 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.382196 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.382246 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.382262 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.382285 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.382304 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.491288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.491364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.491382 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.491429 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.491447 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.505879 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:29 crc kubenswrapper[4728]: E1216 14:58:29.506036 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.506293 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:29 crc kubenswrapper[4728]: E1216 14:58:29.506730 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.525915 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"529fdb48-e1b7-4a69-a177-19750ae97b6e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:56:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09e18726ab68389800cdf480fc5d00a391d68a7aa384b1656b3fa1bc78e74930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cfd94a4a5e01be67441b21769ec401bbceb210c00ed13bf7e69983f4f380732\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-16T14:57:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-16T14:57:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:56:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.542184 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.559966 4728 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5cdc17e-067e-4d74-b768-02966221d3ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-16T14:57:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40a116c04d9ad37ca069e1fec97e1c707395bbea8976edd77a6eb9290dc6f958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-16T14:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj8dq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-16T14:57:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-njzmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-16T14:58:29Z is after 2025-08-24T17:21:41Z" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.593992 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.594064 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.594090 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.594122 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.594145 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.614647 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.61462315 podStartE2EDuration="1m11.61462315s" podCreationTimestamp="2025-12-16 14:57:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.613662814 +0000 UTC m=+90.453841828" watchObservedRunningTime="2025-12-16 14:58:29.61462315 +0000 UTC m=+90.454802174" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.615278 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hbwhm" podStartSLOduration=66.615267988 podStartE2EDuration="1m6.615267988s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.586594108 +0000 UTC m=+90.426773102" watchObservedRunningTime="2025-12-16 14:58:29.615267988 +0000 UTC m=+90.455447002" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.696732 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.696779 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.696795 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.696815 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.696827 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.704109 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6lqf6" podStartSLOduration=66.704083502 podStartE2EDuration="1m6.704083502s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.702344435 +0000 UTC m=+90.542523449" watchObservedRunningTime="2025-12-16 14:58:29.704083502 +0000 UTC m=+90.544262496" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.754985 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.754960714 podStartE2EDuration="1m8.754960714s" podCreationTimestamp="2025-12-16 14:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.752022695 +0000 UTC m=+90.592201709" watchObservedRunningTime="2025-12-16 14:58:29.754960714 +0000 UTC m=+90.595139728" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.793321 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=69.793298097 podStartE2EDuration="1m9.793298097s" podCreationTimestamp="2025-12-16 14:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.773894419 +0000 UTC m=+90.614073453" watchObservedRunningTime="2025-12-16 14:58:29.793298097 +0000 UTC m=+90.633477091" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.799302 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.799348 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.799366 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.799388 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.799428 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.832261 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.832227635 podStartE2EDuration="37.832227635s" podCreationTimestamp="2025-12-16 14:57:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.83204146 +0000 UTC m=+90.672220484" watchObservedRunningTime="2025-12-16 14:58:29.832227635 +0000 UTC m=+90.672406669" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.849284 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bdpsg" podStartSLOduration=66.849263098 podStartE2EDuration="1m6.849263098s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.849222567 +0000 UTC m=+90.689401571" watchObservedRunningTime="2025-12-16 14:58:29.849263098 +0000 UTC m=+90.689442122" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.869462 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9nv7n" podStartSLOduration=66.869437896 podStartE2EDuration="1m6.869437896s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.8684555 +0000 UTC m=+90.708634494" watchObservedRunningTime="2025-12-16 14:58:29.869437896 +0000 UTC m=+90.709616920" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.902393 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.902488 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.902507 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.902533 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:29 crc kubenswrapper[4728]: I1216 14:58:29.902550 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:29Z","lastTransitionTime":"2025-12-16T14:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.005485 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.005548 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.005566 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.005590 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.005608 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.108187 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.108252 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.108275 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.108305 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.108329 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.211294 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.211364 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.211387 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.211467 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.211502 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.314768 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.314841 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.314855 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.314880 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.314893 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.418385 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.418540 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.418562 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.418597 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.418616 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.505736 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.505773 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:30 crc kubenswrapper[4728]: E1216 14:58:30.505866 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:30 crc kubenswrapper[4728]: E1216 14:58:30.505979 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.521288 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.521399 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.521481 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.521512 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.521537 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.624119 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.624178 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.624195 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.624218 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.624261 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.727066 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.727129 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.727142 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.727159 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.727170 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.737825 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.737885 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.737899 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.737913 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.737922 4728 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-16T14:58:30Z","lastTransitionTime":"2025-12-16T14:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.784613 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2gtsc" podStartSLOduration=67.784587543 podStartE2EDuration="1m7.784587543s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:29.911673244 +0000 UTC m=+90.751852258" watchObservedRunningTime="2025-12-16 14:58:30.784587543 +0000 UTC m=+91.624766567" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.786492 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85"] Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.787180 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.789780 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.790099 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.790325 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.790386 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.802916 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podStartSLOduration=67.80289168 podStartE2EDuration="1m7.80289168s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:30.801202365 +0000 UTC m=+91.641381429" watchObservedRunningTime="2025-12-16 14:58:30.80289168 +0000 UTC m=+91.643070704" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.837491 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.83745176 podStartE2EDuration="19.83745176s" podCreationTimestamp="2025-12-16 14:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:30.836840034 +0000 UTC m=+91.677019028" watchObservedRunningTime="2025-12-16 14:58:30.83745176 +0000 UTC m=+91.677630754" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.879239 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88352cfb-adfc-45ef-968b-50978a49ebcf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.879309 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88352cfb-adfc-45ef-968b-50978a49ebcf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.879334 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88352cfb-adfc-45ef-968b-50978a49ebcf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.879355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/88352cfb-adfc-45ef-968b-50978a49ebcf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.879608 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88352cfb-adfc-45ef-968b-50978a49ebcf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.980804 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/88352cfb-adfc-45ef-968b-50978a49ebcf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.981168 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88352cfb-adfc-45ef-968b-50978a49ebcf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.981344 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88352cfb-adfc-45ef-968b-50978a49ebcf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.981613 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88352cfb-adfc-45ef-968b-50978a49ebcf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.981831 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88352cfb-adfc-45ef-968b-50978a49ebcf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.982072 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/88352cfb-adfc-45ef-968b-50978a49ebcf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.981012 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/88352cfb-adfc-45ef-968b-50978a49ebcf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.983091 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88352cfb-adfc-45ef-968b-50978a49ebcf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:30 crc kubenswrapper[4728]: I1216 14:58:30.990852 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88352cfb-adfc-45ef-968b-50978a49ebcf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:31 crc kubenswrapper[4728]: I1216 14:58:31.011537 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/88352cfb-adfc-45ef-968b-50978a49ebcf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t9h85\" (UID: \"88352cfb-adfc-45ef-968b-50978a49ebcf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:31 crc kubenswrapper[4728]: I1216 14:58:31.112776 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" Dec 16 14:58:31 crc kubenswrapper[4728]: W1216 14:58:31.133295 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88352cfb_adfc_45ef_968b_50978a49ebcf.slice/crio-59f9dc4db044bc9b28f368191516b628b7615a5b16a48a32bc41c3de56222751 WatchSource:0}: Error finding container 59f9dc4db044bc9b28f368191516b628b7615a5b16a48a32bc41c3de56222751: Status 404 returned error can't find the container with id 59f9dc4db044bc9b28f368191516b628b7615a5b16a48a32bc41c3de56222751 Dec 16 14:58:31 crc kubenswrapper[4728]: I1216 14:58:31.505977 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:31 crc kubenswrapper[4728]: I1216 14:58:31.505980 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:31 crc kubenswrapper[4728]: E1216 14:58:31.506337 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:31 crc kubenswrapper[4728]: E1216 14:58:31.506183 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:32 crc kubenswrapper[4728]: I1216 14:58:32.082825 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" event={"ID":"88352cfb-adfc-45ef-968b-50978a49ebcf","Type":"ContainerStarted","Data":"544a7b37ccec7d996ba218f9348e053318c465a7601d90ae0ec1331a2ff2b04e"} Dec 16 14:58:32 crc kubenswrapper[4728]: I1216 14:58:32.082868 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" event={"ID":"88352cfb-adfc-45ef-968b-50978a49ebcf","Type":"ContainerStarted","Data":"59f9dc4db044bc9b28f368191516b628b7615a5b16a48a32bc41c3de56222751"} Dec 16 14:58:32 crc kubenswrapper[4728]: I1216 14:58:32.100388 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t9h85" podStartSLOduration=69.100377161 podStartE2EDuration="1m9.100377161s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:58:32.100266138 +0000 UTC m=+92.940445122" watchObservedRunningTime="2025-12-16 14:58:32.100377161 +0000 UTC m=+92.940556145" Dec 16 14:58:32 crc kubenswrapper[4728]: I1216 14:58:32.505913 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:32 crc kubenswrapper[4728]: I1216 14:58:32.505961 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:32 crc kubenswrapper[4728]: E1216 14:58:32.506043 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:32 crc kubenswrapper[4728]: E1216 14:58:32.506178 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:33 crc kubenswrapper[4728]: I1216 14:58:33.506448 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:33 crc kubenswrapper[4728]: I1216 14:58:33.506468 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:33 crc kubenswrapper[4728]: E1216 14:58:33.506662 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:33 crc kubenswrapper[4728]: E1216 14:58:33.506853 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:34 crc kubenswrapper[4728]: I1216 14:58:34.506338 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:34 crc kubenswrapper[4728]: I1216 14:58:34.506450 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:34 crc kubenswrapper[4728]: E1216 14:58:34.506570 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:34 crc kubenswrapper[4728]: E1216 14:58:34.506719 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:35 crc kubenswrapper[4728]: I1216 14:58:35.510752 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:35 crc kubenswrapper[4728]: E1216 14:58:35.511132 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:35 crc kubenswrapper[4728]: I1216 14:58:35.512039 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 14:58:35 crc kubenswrapper[4728]: E1216 14:58:35.512203 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:58:35 crc kubenswrapper[4728]: I1216 14:58:35.512351 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:35 crc kubenswrapper[4728]: E1216 14:58:35.512447 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:36 crc kubenswrapper[4728]: I1216 14:58:36.505459 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:36 crc kubenswrapper[4728]: I1216 14:58:36.505487 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:36 crc kubenswrapper[4728]: E1216 14:58:36.505607 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:36 crc kubenswrapper[4728]: E1216 14:58:36.505852 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:37 crc kubenswrapper[4728]: I1216 14:58:37.505684 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:37 crc kubenswrapper[4728]: E1216 14:58:37.505851 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:37 crc kubenswrapper[4728]: I1216 14:58:37.505688 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:37 crc kubenswrapper[4728]: E1216 14:58:37.506015 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:38 crc kubenswrapper[4728]: I1216 14:58:38.506218 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:38 crc kubenswrapper[4728]: I1216 14:58:38.506231 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:38 crc kubenswrapper[4728]: E1216 14:58:38.507056 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:38 crc kubenswrapper[4728]: E1216 14:58:38.507184 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:39 crc kubenswrapper[4728]: I1216 14:58:39.505625 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:39 crc kubenswrapper[4728]: I1216 14:58:39.505699 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:39 crc kubenswrapper[4728]: E1216 14:58:39.507820 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:39 crc kubenswrapper[4728]: E1216 14:58:39.507972 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:40 crc kubenswrapper[4728]: I1216 14:58:40.505688 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:40 crc kubenswrapper[4728]: I1216 14:58:40.505738 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:40 crc kubenswrapper[4728]: E1216 14:58:40.505891 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:40 crc kubenswrapper[4728]: E1216 14:58:40.506027 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:41 crc kubenswrapper[4728]: I1216 14:58:41.506343 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:41 crc kubenswrapper[4728]: I1216 14:58:41.506457 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:41 crc kubenswrapper[4728]: E1216 14:58:41.507447 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:41 crc kubenswrapper[4728]: E1216 14:58:41.507708 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:42 crc kubenswrapper[4728]: I1216 14:58:42.021619 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:42 crc kubenswrapper[4728]: E1216 14:58:42.021935 4728 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:58:42 crc kubenswrapper[4728]: E1216 14:58:42.022080 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs podName:d13ff897-af48-416f-ba3f-44f7e4344a75 nodeName:}" failed. No retries permitted until 2025-12-16 14:59:46.022040852 +0000 UTC m=+166.862219866 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs") pod "network-metrics-daemon-kjxbh" (UID: "d13ff897-af48-416f-ba3f-44f7e4344a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 16 14:58:42 crc kubenswrapper[4728]: I1216 14:58:42.505688 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:42 crc kubenswrapper[4728]: E1216 14:58:42.505852 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:42 crc kubenswrapper[4728]: I1216 14:58:42.505688 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:42 crc kubenswrapper[4728]: E1216 14:58:42.506145 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:43 crc kubenswrapper[4728]: I1216 14:58:43.506715 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:43 crc kubenswrapper[4728]: I1216 14:58:43.506809 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:43 crc kubenswrapper[4728]: E1216 14:58:43.506884 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:43 crc kubenswrapper[4728]: E1216 14:58:43.507045 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:44 crc kubenswrapper[4728]: I1216 14:58:44.505575 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:44 crc kubenswrapper[4728]: E1216 14:58:44.505818 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:44 crc kubenswrapper[4728]: I1216 14:58:44.505884 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:44 crc kubenswrapper[4728]: E1216 14:58:44.506064 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:45 crc kubenswrapper[4728]: I1216 14:58:45.505722 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:45 crc kubenswrapper[4728]: E1216 14:58:45.505918 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:45 crc kubenswrapper[4728]: I1216 14:58:45.506457 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:45 crc kubenswrapper[4728]: E1216 14:58:45.506691 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:46 crc kubenswrapper[4728]: I1216 14:58:46.506027 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:46 crc kubenswrapper[4728]: E1216 14:58:46.506752 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:46 crc kubenswrapper[4728]: I1216 14:58:46.506150 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:46 crc kubenswrapper[4728]: E1216 14:58:46.507134 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:46 crc kubenswrapper[4728]: I1216 14:58:46.507359 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 14:58:46 crc kubenswrapper[4728]: E1216 14:58:46.507645 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:58:47 crc kubenswrapper[4728]: I1216 14:58:47.506040 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:47 crc kubenswrapper[4728]: E1216 14:58:47.506236 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:47 crc kubenswrapper[4728]: I1216 14:58:47.506308 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:47 crc kubenswrapper[4728]: E1216 14:58:47.506657 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:48 crc kubenswrapper[4728]: I1216 14:58:48.505847 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:48 crc kubenswrapper[4728]: E1216 14:58:48.505942 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:48 crc kubenswrapper[4728]: I1216 14:58:48.506035 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:48 crc kubenswrapper[4728]: E1216 14:58:48.506314 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:49 crc kubenswrapper[4728]: I1216 14:58:49.506290 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:49 crc kubenswrapper[4728]: I1216 14:58:49.509308 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:49 crc kubenswrapper[4728]: E1216 14:58:49.509913 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:49 crc kubenswrapper[4728]: E1216 14:58:49.510024 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:50 crc kubenswrapper[4728]: I1216 14:58:50.506321 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:50 crc kubenswrapper[4728]: I1216 14:58:50.506388 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:50 crc kubenswrapper[4728]: E1216 14:58:50.506556 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:50 crc kubenswrapper[4728]: E1216 14:58:50.506690 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:51 crc kubenswrapper[4728]: I1216 14:58:51.505883 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:51 crc kubenswrapper[4728]: I1216 14:58:51.505927 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:51 crc kubenswrapper[4728]: E1216 14:58:51.506116 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:51 crc kubenswrapper[4728]: E1216 14:58:51.506247 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:52 crc kubenswrapper[4728]: I1216 14:58:52.505924 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:52 crc kubenswrapper[4728]: I1216 14:58:52.505935 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:52 crc kubenswrapper[4728]: E1216 14:58:52.506466 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:52 crc kubenswrapper[4728]: E1216 14:58:52.506608 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:53 crc kubenswrapper[4728]: I1216 14:58:53.505634 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:53 crc kubenswrapper[4728]: I1216 14:58:53.505693 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:53 crc kubenswrapper[4728]: E1216 14:58:53.505789 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:53 crc kubenswrapper[4728]: E1216 14:58:53.505916 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:54 crc kubenswrapper[4728]: I1216 14:58:54.505614 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:54 crc kubenswrapper[4728]: I1216 14:58:54.505666 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:54 crc kubenswrapper[4728]: E1216 14:58:54.505777 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:54 crc kubenswrapper[4728]: E1216 14:58:54.505962 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:55 crc kubenswrapper[4728]: I1216 14:58:55.506338 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:55 crc kubenswrapper[4728]: I1216 14:58:55.506380 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:55 crc kubenswrapper[4728]: E1216 14:58:55.506634 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:55 crc kubenswrapper[4728]: E1216 14:58:55.506722 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:56 crc kubenswrapper[4728]: I1216 14:58:56.505924 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:56 crc kubenswrapper[4728]: I1216 14:58:56.505955 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:56 crc kubenswrapper[4728]: E1216 14:58:56.506163 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:56 crc kubenswrapper[4728]: E1216 14:58:56.506368 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.173801 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/1.log" Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.174279 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/0.log" Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.174340 4728 generic.go:334] "Generic (PLEG): container finished" podID="57f7e48b-7353-469c-ab9d-7f966c08d5f1" containerID="1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076" exitCode=1 Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.174374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerDied","Data":"1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076"} Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.174448 4728 scope.go:117] "RemoveContainer" containerID="25fbec1fd30e4412c1eed6f52647defc5032bb26b82bdb32b51bd8e0cfae8d30" Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.174844 4728 scope.go:117] "RemoveContainer" containerID="1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076" Dec 16 14:58:57 crc kubenswrapper[4728]: E1216 14:58:57.175016 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bdpsg_openshift-multus(57f7e48b-7353-469c-ab9d-7f966c08d5f1)\"" pod="openshift-multus/multus-bdpsg" podUID="57f7e48b-7353-469c-ab9d-7f966c08d5f1" Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.506089 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:57 crc kubenswrapper[4728]: I1216 14:58:57.506153 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:57 crc kubenswrapper[4728]: E1216 14:58:57.506246 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:57 crc kubenswrapper[4728]: E1216 14:58:57.506344 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:58 crc kubenswrapper[4728]: I1216 14:58:58.180741 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/1.log" Dec 16 14:58:58 crc kubenswrapper[4728]: I1216 14:58:58.505861 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:58:58 crc kubenswrapper[4728]: I1216 14:58:58.505883 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:58:58 crc kubenswrapper[4728]: E1216 14:58:58.506085 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:58:58 crc kubenswrapper[4728]: E1216 14:58:58.506700 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:58:58 crc kubenswrapper[4728]: I1216 14:58:58.507112 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 14:58:58 crc kubenswrapper[4728]: E1216 14:58:58.507369 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2458v_openshift-ovn-kubernetes(480f8c1b-60cc-4685-86cc-a457f645e87c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" Dec 16 14:58:59 crc kubenswrapper[4728]: E1216 14:58:59.447184 4728 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 16 14:58:59 crc kubenswrapper[4728]: I1216 14:58:59.505687 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:58:59 crc kubenswrapper[4728]: I1216 14:58:59.505687 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:58:59 crc kubenswrapper[4728]: E1216 14:58:59.508124 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:58:59 crc kubenswrapper[4728]: E1216 14:58:59.508276 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:58:59 crc kubenswrapper[4728]: E1216 14:58:59.608100 4728 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:59:00 crc kubenswrapper[4728]: I1216 14:59:00.505914 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:00 crc kubenswrapper[4728]: I1216 14:59:00.505941 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:00 crc kubenswrapper[4728]: E1216 14:59:00.506331 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:59:00 crc kubenswrapper[4728]: E1216 14:59:00.506379 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:59:01 crc kubenswrapper[4728]: I1216 14:59:01.506298 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:01 crc kubenswrapper[4728]: I1216 14:59:01.506396 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:01 crc kubenswrapper[4728]: E1216 14:59:01.506528 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:59:01 crc kubenswrapper[4728]: E1216 14:59:01.506859 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:59:02 crc kubenswrapper[4728]: I1216 14:59:02.505703 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:02 crc kubenswrapper[4728]: I1216 14:59:02.505814 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:02 crc kubenswrapper[4728]: E1216 14:59:02.505898 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:59:02 crc kubenswrapper[4728]: E1216 14:59:02.506004 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:59:03 crc kubenswrapper[4728]: I1216 14:59:03.506303 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:03 crc kubenswrapper[4728]: E1216 14:59:03.506465 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:59:03 crc kubenswrapper[4728]: I1216 14:59:03.506597 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:03 crc kubenswrapper[4728]: E1216 14:59:03.506719 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:59:04 crc kubenswrapper[4728]: I1216 14:59:04.517944 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:04 crc kubenswrapper[4728]: E1216 14:59:04.518217 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:59:04 crc kubenswrapper[4728]: I1216 14:59:04.518456 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:04 crc kubenswrapper[4728]: E1216 14:59:04.518720 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:59:04 crc kubenswrapper[4728]: E1216 14:59:04.609775 4728 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 16 14:59:05 crc kubenswrapper[4728]: I1216 14:59:05.506390 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:05 crc kubenswrapper[4728]: I1216 14:59:05.506547 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:05 crc kubenswrapper[4728]: E1216 14:59:05.506631 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:59:05 crc kubenswrapper[4728]: E1216 14:59:05.506802 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:59:06 crc kubenswrapper[4728]: I1216 14:59:06.505455 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:06 crc kubenswrapper[4728]: I1216 14:59:06.505461 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:06 crc kubenswrapper[4728]: E1216 14:59:06.505694 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:59:06 crc kubenswrapper[4728]: E1216 14:59:06.505792 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:59:07 crc kubenswrapper[4728]: I1216 14:59:07.505740 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:07 crc kubenswrapper[4728]: I1216 14:59:07.505879 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:07 crc kubenswrapper[4728]: E1216 14:59:07.505980 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:59:07 crc kubenswrapper[4728]: E1216 14:59:07.506149 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:59:07 crc kubenswrapper[4728]: I1216 14:59:07.506906 4728 scope.go:117] "RemoveContainer" containerID="1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076" Dec 16 14:59:08 crc kubenswrapper[4728]: I1216 14:59:08.219319 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/1.log" Dec 16 14:59:08 crc kubenswrapper[4728]: I1216 14:59:08.219441 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerStarted","Data":"e87cfd286c066fb2008d76673f5ffbf9c66c1224fbc2a064ad159a47c3c27d99"} Dec 16 14:59:08 crc kubenswrapper[4728]: I1216 14:59:08.506212 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:08 crc kubenswrapper[4728]: I1216 14:59:08.506249 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:08 crc kubenswrapper[4728]: E1216 14:59:08.506392 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 16 14:59:08 crc kubenswrapper[4728]: E1216 14:59:08.506546 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 16 14:59:09 crc kubenswrapper[4728]: I1216 14:59:09.505402 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:09 crc kubenswrapper[4728]: E1216 14:59:09.507459 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 16 14:59:09 crc kubenswrapper[4728]: I1216 14:59:09.507497 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:09 crc kubenswrapper[4728]: E1216 14:59:09.508370 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75" Dec 16 14:59:09 crc kubenswrapper[4728]: I1216 14:59:09.508584 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 14:59:09 crc kubenswrapper[4728]: E1216 14:59:09.610835 4728 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.232700 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/3.log"
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.236441 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerStarted","Data":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"}
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.237872 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2458v"
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.275918 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podStartSLOduration=107.275891171 podStartE2EDuration="1m47.275891171s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:10.273938128 +0000 UTC m=+131.114117122" watchObservedRunningTime="2025-12-16 14:59:10.275891171 +0000 UTC m=+131.116070205"
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.392057 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kjxbh"]
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.392154 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:59:10 crc kubenswrapper[4728]: E1216 14:59:10.392241 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.505637 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:59:10 crc kubenswrapper[4728]: I1216 14:59:10.505660 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:59:10 crc kubenswrapper[4728]: E1216 14:59:10.505764 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:59:10 crc kubenswrapper[4728]: E1216 14:59:10.505965 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:59:11 crc kubenswrapper[4728]: I1216 14:59:11.506276 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:59:11 crc kubenswrapper[4728]: E1216 14:59:11.506581 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:59:12 crc kubenswrapper[4728]: I1216 14:59:12.506258 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:59:12 crc kubenswrapper[4728]: I1216 14:59:12.506382 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:59:12 crc kubenswrapper[4728]: I1216 14:59:12.506295 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:59:12 crc kubenswrapper[4728]: E1216 14:59:12.506509 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:59:12 crc kubenswrapper[4728]: E1216 14:59:12.506896 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:59:12 crc kubenswrapper[4728]: E1216 14:59:12.507045 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:59:13 crc kubenswrapper[4728]: I1216 14:59:13.506024 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:59:13 crc kubenswrapper[4728]: E1216 14:59:13.506203 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 16 14:59:14 crc kubenswrapper[4728]: I1216 14:59:14.505959 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:59:14 crc kubenswrapper[4728]: I1216 14:59:14.506051 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:59:14 crc kubenswrapper[4728]: E1216 14:59:14.506148 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 16 14:59:14 crc kubenswrapper[4728]: I1216 14:59:14.506082 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:59:14 crc kubenswrapper[4728]: E1216 14:59:14.506300 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 16 14:59:14 crc kubenswrapper[4728]: E1216 14:59:14.506368 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kjxbh" podUID="d13ff897-af48-416f-ba3f-44f7e4344a75"
Dec 16 14:59:15 crc kubenswrapper[4728]: I1216 14:59:15.506468 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 16 14:59:15 crc kubenswrapper[4728]: I1216 14:59:15.509387 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 16 14:59:15 crc kubenswrapper[4728]: I1216 14:59:15.509518 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.506066 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.506137 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.506263 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.509246 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.509301 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.509802 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 16 14:59:16 crc kubenswrapper[4728]: I1216 14:59:16.509806 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.220651 4728 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.268264 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7rjsg"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.268869 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.273156 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.273795 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.273872 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.274197 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.276027 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.276372 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.277072 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.277518 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.278373 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.278486 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.278695 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.278725 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.278924 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.279054 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.279283 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.279486 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.279924 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.280161 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.280327 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.280871 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.281014 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.281596 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.281893 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.282047 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.282340 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.282470 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.283039 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.283288 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.283392 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.284401 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.284929 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.285839 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.286287 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.287312 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6f6v"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.287776 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.301735 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.302135 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.302443 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.302718 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.302738 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.302849 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.302915 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.317221 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.326558 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.326692 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.326563 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.326980 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327159 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327483 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327557 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327705 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327912 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327966 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.327913 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.328040 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.328090 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.328547 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.328973 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329166 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329293 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329375 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329463 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329495 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329608 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329692 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329776 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329785 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329848 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.329920 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330240 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330367 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330526 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330638 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330722 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330803 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.330897 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.333622 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hnrqv"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.334052 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.334113 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.334628 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335507 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-config\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335539 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335558 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335584 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-policies\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335600 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-encryption-config\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335614 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-config\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335628 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a752efc5-d365-4774-a134-f2199e58d26e-machine-approver-tls\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335643 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335660 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a752efc5-d365-4774-a134-f2199e58d26e-auth-proxy-config\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335685 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-etcd-serving-ca\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335763 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-audit-dir\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335799 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335817 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stl4x\" (UniqueName: \"kubernetes.io/projected/a752efc5-d365-4774-a134-f2199e58d26e-kube-api-access-stl4x\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335834 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-serving-cert\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335871 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-client-ca\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335890 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-service-ca-bundle\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335920 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d1c9711-c6ae-4b4a-bafb-07891dd80514-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tznx9\" (UID: \"4d1c9711-c6ae-4b4a-bafb-07891dd80514\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335936 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06463dfe-cbd3-45df-847a-732f66305f9a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335963 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-dir\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335981 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.335998 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336015 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336104 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-image-import-ca\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336128 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336171 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d57p\" (UniqueName: \"kubernetes.io/projected/06463dfe-cbd3-45df-847a-732f66305f9a-kube-api-access-6d57p\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336238 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-serving-cert\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336317 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-audit\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336357 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06463dfe-cbd3-45df-847a-732f66305f9a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336389 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336506 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bv4p\" (UniqueName: \"kubernetes.io/projected/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-kube-api-access-7bv4p\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336552 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s9z\" (UniqueName: \"kubernetes.io/projected/83befb95-6263-4eb5-85f9-7d061e73a0f4-kube-api-access-h8s9z\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336575 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336590 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336645 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2nz\" (UniqueName: \"kubernetes.io/projected/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-kube-api-access-wc2nz\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336674 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83befb95-6263-4eb5-85f9-7d061e73a0f4-serving-cert\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336709 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a752efc5-d365-4774-a134-f2199e58d26e-config\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336739 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2344bb62-ca23-4655-80b7-04fd2f766b9f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336768 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2344bb62-ca23-4655-80b7-04fd2f766b9f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336821 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336850 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-config\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336917 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.336915 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzlzn\" (UniqueName: \"kubernetes.io/projected/eeff725e-9dab-4bec-99f6-8105af9b3b6c-kube-api-access-jzlzn\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpj4s\" (UniqueName: \"kubernetes.io/projected/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-kube-api-access-bpj4s\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337090 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxnsm\" (UniqueName: \"kubernetes.io/projected/2344bb62-ca23-4655-80b7-04fd2f766b9f-kube-api-access-wxnsm\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337114 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337137 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337178 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbs8\" (UniqueName: \"kubernetes.io/projected/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-kube-api-access-sfbs8\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337205 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337226 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-node-pullsecrets\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337246 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-etcd-client\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337267 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337289 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337317 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxkpm\" (UniqueName: \"kubernetes.io/projected/4d1c9711-c6ae-4b4a-bafb-07891dd80514-kube-api-access-gxkpm\") pod \"cluster-samples-operator-665b6dd947-tznx9\" (UID: \"4d1c9711-c6ae-4b4a-bafb-07891dd80514\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337339 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337354 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337362 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-serving-cert\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.337496 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pfz7w"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.338123 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pr5wl"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.338201 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.338659 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pr5wl"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.339456 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x6b6q"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.339991 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m9gpr"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.340606 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.340772 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.340933 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x6b6q"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.341718 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.342695 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.344453 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-phlkm"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.344962 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-phlkm"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.345186 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hw5mk"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.345851 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.347294 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.349129 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7rjsg"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.349170 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.349502 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.349578 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.350184 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.350680 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.350805 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.351127 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.364802 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ggsk6"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368092 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368115 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368089 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368209 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368240 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368683 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.368682 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.386317 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.386806 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.387221 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6"]
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.387696 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6"
Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.387883 4728 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.389692 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.389859 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390109 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390132 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390149 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390293 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390520 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390594 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.390682 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.392786 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.393064 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.394914 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395062 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395224 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395278 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395383 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395519 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395629 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395847 4728 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.395965 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.396592 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.396610 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.396650 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.396750 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.397106 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.397265 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.397453 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.398601 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kqbmn"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.399308 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.399428 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.399495 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.399639 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.399728 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.401278 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4tpkr"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.401695 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.402046 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.402301 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.402908 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.403600 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.403632 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.404155 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.405220 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.407545 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.408077 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m9d45"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.408555 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.408698 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.409008 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6f6v"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.410460 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g66rv"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.410943 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.411318 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.411549 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.411577 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.411822 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.412692 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.412800 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.414259 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hnrqv"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.415959 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.420536 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k6z5"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.421147 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.421254 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.422523 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.423214 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.423224 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.423761 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.426278 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-55pkk"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.426876 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.427854 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.428281 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.429483 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.438373 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hw5mk"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.438908 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d1c9711-c6ae-4b4a-bafb-07891dd80514-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tznx9\" (UID: \"4d1c9711-c6ae-4b4a-bafb-07891dd80514\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.439090 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.439215 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06463dfe-cbd3-45df-847a-732f66305f9a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.439350 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-dir\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.439487 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.439797 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.439904 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc 
kubenswrapper[4728]: I1216 14:59:22.439999 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06463dfe-cbd3-45df-847a-732f66305f9a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-encryption-config\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440195 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-image-import-ca\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440295 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d57p\" (UniqueName: \"kubernetes.io/projected/06463dfe-cbd3-45df-847a-732f66305f9a-kube-api-access-6d57p\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440433 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-serving-cert\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440550 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-audit\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06463dfe-cbd3-45df-847a-732f66305f9a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440849 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.440980 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7bv4p\" (UniqueName: \"kubernetes.io/projected/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-kube-api-access-7bv4p\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441082 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441181 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s9z\" (UniqueName: \"kubernetes.io/projected/83befb95-6263-4eb5-85f9-7d061e73a0f4-kube-api-access-h8s9z\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441303 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-etcd-client\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441495 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c809652e-9ba8-4499-b2c5-4a0015048ea0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441599 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c809652e-9ba8-4499-b2c5-4a0015048ea0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441695 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee42482-d554-4944-8fce-e503c10c0ec9-serving-cert\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.441805 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2nz\" (UniqueName: \"kubernetes.io/projected/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-kube-api-access-wc2nz\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.442328 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-dir\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.442344 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83befb95-6263-4eb5-85f9-7d061e73a0f4-serving-cert\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.442695 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-client\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.442840 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vsn\" (UniqueName: \"kubernetes.io/projected/5ee42482-d554-4944-8fce-e503c10c0ec9-kube-api-access-88vsn\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.443000 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b5jn\" (UniqueName: \"kubernetes.io/projected/179db76a-61fb-4ae3-a494-be67c44c7d65-kube-api-access-7b5jn\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.443248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a752efc5-d365-4774-a134-f2199e58d26e-config\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.444274 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2344bb62-ca23-4655-80b7-04fd2f766b9f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.444684 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-ca\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.444795 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99bk8\" (UniqueName: \"kubernetes.io/projected/44dea519-de7c-4bc5-934e-5c251177b6fd-kube-api-access-99bk8\") pod 
\"migrator-59844c95c7-l9s9x\" (UID: \"44dea519-de7c-4bc5-934e-5c251177b6fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445004 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2344bb62-ca23-4655-80b7-04fd2f766b9f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445133 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445236 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-serving-cert\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445697 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-audit\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445343 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-config\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445855 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzlzn\" (UniqueName: \"kubernetes.io/projected/eeff725e-9dab-4bec-99f6-8105af9b3b6c-kube-api-access-jzlzn\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445887 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpj4s\" (UniqueName: \"kubernetes.io/projected/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-kube-api-access-bpj4s\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.445927 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446014 
4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxnsm\" (UniqueName: \"kubernetes.io/projected/2344bb62-ca23-4655-80b7-04fd2f766b9f-kube-api-access-wxnsm\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446535 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446623 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbs8\" (UniqueName: \"kubernetes.io/projected/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-kube-api-access-sfbs8\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446703 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446728 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446782 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrv7f\" (UniqueName: \"kubernetes.io/projected/7a82fd60-88d6-4e1a-b140-bf2f92ea0a87-kube-api-access-zrv7f\") pod \"downloads-7954f5f757-x6b6q\" (UID: \"7a82fd60-88d6-4e1a-b140-bf2f92ea0a87\") " pod="openshift-console/downloads-7954f5f757-x6b6q" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446802 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-etcd-client\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446858 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446883 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446926 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446937 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxkpm\" (UniqueName: \"kubernetes.io/projected/4d1c9711-c6ae-4b4a-bafb-07891dd80514-kube-api-access-gxkpm\") pod \"cluster-samples-operator-665b6dd947-tznx9\" (UID: \"4d1c9711-c6ae-4b4a-bafb-07891dd80514\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446958 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.446960 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-node-pullsecrets\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-serving-cert\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447117 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-service-ca\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447132 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a752efc5-d365-4774-a134-f2199e58d26e-config\") pod \"machine-approver-56656f9798-p2hzh\" (UID: 
\"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447166 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-config\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447231 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447290 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447326 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-policies\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447358 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-config\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447387 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-encryption-config\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447428 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-config\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447451 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:22 
crc kubenswrapper[4728]: I1216 14:59:22.447473 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a752efc5-d365-4774-a134-f2199e58d26e-auth-proxy-config\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447493 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a752efc5-d365-4774-a134-f2199e58d26e-machine-approver-tls\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447516 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-etcd-serving-ca\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447539 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-audit-policies\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447582 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-audit-dir\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447605 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447627 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stl4x\" (UniqueName: \"kubernetes.io/projected/a752efc5-d365-4774-a134-f2199e58d26e-kube-api-access-stl4x\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447645 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-serving-cert\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447673 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-client-ca\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447696 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c809652e-9ba8-4499-b2c5-4a0015048ea0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447718 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/179db76a-61fb-4ae3-a494-be67c44c7d65-audit-dir\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.447742 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-service-ca-bundle\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.448807 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-service-ca-bundle\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.449018 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06463dfe-cbd3-45df-847a-732f66305f9a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.449330 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2344bb62-ca23-4655-80b7-04fd2f766b9f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450009 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-image-import-ca\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450015 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-config\") pod \"apiserver-76f77b778f-7rjsg\" 
(UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450243 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450369 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450377 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83befb95-6263-4eb5-85f9-7d061e73a0f4-config\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450459 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x6b6q"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.442581 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.450970 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.451185 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a752efc5-d365-4774-a134-f2199e58d26e-auth-proxy-config\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.451910 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.452087 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d1c9711-c6ae-4b4a-bafb-07891dd80514-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tznx9\" (UID: \"4d1c9711-c6ae-4b4a-bafb-07891dd80514\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.452883 4728 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.453885 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.454249 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-client-ca\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.454392 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83befb95-6263-4eb5-85f9-7d061e73a0f4-serving-cert\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.454093 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-node-pullsecrets\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.468556 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-config\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.468563 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.474110 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2344bb62-ca23-4655-80b7-04fd2f766b9f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.474645 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-ocp-branding-template\") 
pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.475645 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.475810 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-etcd-serving-ca\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.475980 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.476113 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.476530 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a752efc5-d365-4774-a134-f2199e58d26e-machine-approver-tls\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.476812 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.477231 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.477827 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.477936 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-policies\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.477984 
4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-audit-dir\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.478270 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.478765 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.478930 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-serving-cert\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.479169 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pfz7w"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.480205 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.480369 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-etcd-client\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.480512 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-encryption-config\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.480697 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.481052 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.481184 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.481251 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-serving-cert\") 
pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.482128 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.482644 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pr5wl"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.483796 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-serving-cert\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.485418 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-phlkm"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.489388 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.490522 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kqbmn"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.490865 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.491826 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m9gpr"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.493484 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.495233 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.495325 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.497672 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.497920 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.499265 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.500611 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4tpkr"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.501740 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.503169 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.504454 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ggsk6"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.505755 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m9d45"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.507361 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g66rv"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.508905 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k6z5"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.510365 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5t99c"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.510421 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.511384 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.511551 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-87x9r"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.511914 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.512632 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.513766 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.516289 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-87x9r"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.517898 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5t99c"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.519037 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mf9mx"] Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.519939 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.530445 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548400 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrv7f\" (UniqueName: \"kubernetes.io/projected/7a82fd60-88d6-4e1a-b140-bf2f92ea0a87-kube-api-access-zrv7f\") pod \"downloads-7954f5f757-x6b6q\" (UID: \"7a82fd60-88d6-4e1a-b140-bf2f92ea0a87\") " pod="openshift-console/downloads-7954f5f757-x6b6q" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548451 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548470 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-service-ca\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548501 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-config\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548520 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-audit-policies\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548575 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/179db76a-61fb-4ae3-a494-be67c44c7d65-audit-dir\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548596 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c809652e-9ba8-4499-b2c5-4a0015048ea0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548658 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/179db76a-61fb-4ae3-a494-be67c44c7d65-audit-dir\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 
14:59:22.548613 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.548710 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-encryption-config\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549164 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549201 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549228 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c809652e-9ba8-4499-b2c5-4a0015048ea0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549244 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee42482-d554-4944-8fce-e503c10c0ec9-serving-cert\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549282 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-etcd-client\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549297 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c809652e-9ba8-4499-b2c5-4a0015048ea0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549386 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-client\") pod \"etcd-operator-b45778765-ggsk6\" (UID: 
\"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549425 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88vsn\" (UniqueName: \"kubernetes.io/projected/5ee42482-d554-4944-8fce-e503c10c0ec9-kube-api-access-88vsn\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549441 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-ca\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549457 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b5jn\" (UniqueName: \"kubernetes.io/projected/179db76a-61fb-4ae3-a494-be67c44c7d65-kube-api-access-7b5jn\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549474 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99bk8\" (UniqueName: \"kubernetes.io/projected/44dea519-de7c-4bc5-934e-5c251177b6fd-kube-api-access-99bk8\") pod \"migrator-59844c95c7-l9s9x\" (UID: \"44dea519-de7c-4bc5-934e-5c251177b6fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549498 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-serving-cert\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.549634 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/179db76a-61fb-4ae3-a494-be67c44c7d65-audit-policies\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.550442 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.552271 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-etcd-client\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.552957 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-encryption-config\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 
14:59:22.555210 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/179db76a-61fb-4ae3-a494-be67c44c7d65-serving-cert\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.570551 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.590265 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.610632 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.621964 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c809652e-9ba8-4499-b2c5-4a0015048ea0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.631168 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.640509 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c809652e-9ba8-4499-b2c5-4a0015048ea0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.650661 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.670869 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.695188 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.711474 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.731768 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.752351 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.771736 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.801925 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.810884 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.831007 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.850995 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.871962 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.891759 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.911130 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.931877 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.952888 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.963944 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee42482-d554-4944-8fce-e503c10c0ec9-serving-cert\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.972402 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.984254 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-client\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:22 crc kubenswrapper[4728]: I1216 14:59:22.991996 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.011350 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.020039 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-config\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.031074 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.053852 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.061149 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-ca\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.091308 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.111267 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.133183 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.140332 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5ee42482-d554-4944-8fce-e503c10c0ec9-etcd-service-ca\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.152556 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.171281 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.191457 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.211339 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.231332 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.252005 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.271483 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.291452 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.311563 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.331911 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.351824 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.371793 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.391373 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.409765 4728 request.go:700] Waited for 1.003789881s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/secrets?fieldSelector=metadata.name%3Dkube-controller-manager-operator-serving-cert&limit=500&resourceVersion=0 Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.411780 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.431689 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.452014 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.471270 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.492130 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.514189 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.532302 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.551349 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.572028 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.591159 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.611690 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.632162 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.651565 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.671369 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.692211 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.712009 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.731924 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.751986 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.772104 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.811960 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.830812 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.851460 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.871840 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.907972 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.911635 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.932598 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.952241 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.971853 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 14:59:23 crc kubenswrapper[4728]: I1216 14:59:23.992066 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.011193 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.031573 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.051575 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.071311 4728 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.091886 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.111340 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.188900 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bv4p\" (UniqueName: \"kubernetes.io/projected/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-kube-api-access-7bv4p\") pod \"route-controller-manager-6576b87f9c-jfcxb\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.210287 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2nz\" (UniqueName: \"kubernetes.io/projected/b7a24aa5-4c07-4f1f-984a-c3e4a73231e4-kube-api-access-wc2nz\") pod \"openshift-config-operator-7777fb866f-25mqz\" (UID: \"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.212070 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d57p\" (UniqueName: \"kubernetes.io/projected/06463dfe-cbd3-45df-847a-732f66305f9a-kube-api-access-6d57p\") pod \"openshift-controller-manager-operator-756b6f6bc6-jwssj\" (UID: \"06463dfe-cbd3-45df-847a-732f66305f9a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.217651 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.228911 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxnsm\" (UniqueName: \"kubernetes.io/projected/2344bb62-ca23-4655-80b7-04fd2f766b9f-kube-api-access-wxnsm\") pod \"openshift-apiserver-operator-796bbdcf4f-ldfwg\" (UID: \"2344bb62-ca23-4655-80b7-04fd2f766b9f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.229080 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.238120 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzlzn\" (UniqueName: \"kubernetes.io/projected/eeff725e-9dab-4bec-99f6-8105af9b3b6c-kube-api-access-jzlzn\") pod \"oauth-openshift-558db77b4-h6f6v\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.251203 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s9z\" (UniqueName: \"kubernetes.io/projected/83befb95-6263-4eb5-85f9-7d061e73a0f4-kube-api-access-h8s9z\") pod \"authentication-operator-69f744f599-ncw5z\" (UID: \"83befb95-6263-4eb5-85f9-7d061e73a0f4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.267305 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpj4s\" (UniqueName: \"kubernetes.io/projected/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-kube-api-access-bpj4s\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.288399 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2cfce66-2c6c-41ad-84f6-6726c3ea088d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q8nqs\" (UID: \"f2cfce66-2c6c-41ad-84f6-6726c3ea088d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.298313 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.307723 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.308313 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stl4x\" (UniqueName: \"kubernetes.io/projected/a752efc5-d365-4774-a134-f2199e58d26e-kube-api-access-stl4x\") pod \"machine-approver-56656f9798-p2hzh\" (UID: \"a752efc5-d365-4774-a134-f2199e58d26e\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.313130 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.329937 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxkpm\" (UniqueName: \"kubernetes.io/projected/4d1c9711-c6ae-4b4a-bafb-07891dd80514-kube-api-access-gxkpm\") pod \"cluster-samples-operator-665b6dd947-tznx9\" (UID: \"4d1c9711-c6ae-4b4a-bafb-07891dd80514\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.345774 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbs8\" (UniqueName: \"kubernetes.io/projected/6f4e29bc-31e9-421e-abdf-c692d1dfb0c7-kube-api-access-sfbs8\") pod \"apiserver-76f77b778f-7rjsg\" (UID: \"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7\") " pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.352048 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.371658 4728 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.391574 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.410937 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.411163 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.429619 4728 request.go:700] Waited for 1.917518348s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Dec 16 14:59:24 crc kubenswrapper[4728]: W1216 14:59:24.431263 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2344bb62_ca23_4655_80b7_04fd2f766b9f.slice/crio-63845ac1a414cff2c654d387622557dd654b8f18bfa6b2691854a0326dbf2c30 WatchSource:0}: Error finding container 63845ac1a414cff2c654d387622557dd654b8f18bfa6b2691854a0326dbf2c30: Status 404 returned error can't find the container with id 63845ac1a414cff2c654d387622557dd654b8f18bfa6b2691854a0326dbf2c30 Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.431348 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.431553 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.451592 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.465733 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.471999 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.491720 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.493901 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.506113 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.512257 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.530029 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ncw5z"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.534537 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.541843 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.553093 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6f6v"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.567669 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrv7f\" (UniqueName: \"kubernetes.io/projected/7a82fd60-88d6-4e1a-b140-bf2f92ea0a87-kube-api-access-zrv7f\") pod \"downloads-7954f5f757-x6b6q\" (UID: \"7a82fd60-88d6-4e1a-b140-bf2f92ea0a87\") " pod="openshift-console/downloads-7954f5f757-x6b6q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.582879 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"] Dec 16 14:59:24 crc kubenswrapper[4728]: W1216 14:59:24.583086 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83befb95_6263_4eb5_85f9_7d061e73a0f4.slice/crio-b23822adf8118f086b351b98a648862c8562c9abd4e82db9408c6cbdca849316 WatchSource:0}: Error finding container b23822adf8118f086b351b98a648862c8562c9abd4e82db9408c6cbdca849316: Status 404 returned error can't find the container with id b23822adf8118f086b351b98a648862c8562c9abd4e82db9408c6cbdca849316 Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.599884 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c809652e-9ba8-4499-b2c5-4a0015048ea0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-49d5p\" (UID: \"c809652e-9ba8-4499-b2c5-4a0015048ea0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 
14:59:24.609400 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vsn\" (UniqueName: \"kubernetes.io/projected/5ee42482-d554-4944-8fce-e503c10c0ec9-kube-api-access-88vsn\") pod \"etcd-operator-b45778765-ggsk6\" (UID: \"5ee42482-d554-4944-8fce-e503c10c0ec9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:24 crc kubenswrapper[4728]: W1216 14:59:24.623027 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2555bf9d_eb8f_4c07_b78a_8aec21d16b75.slice/crio-af3907fb59a249bceefca1f6bb2d89bbd71e11b94a029843613d620382976431 WatchSource:0}: Error finding container af3907fb59a249bceefca1f6bb2d89bbd71e11b94a029843613d620382976431: Status 404 returned error can't find the container with id af3907fb59a249bceefca1f6bb2d89bbd71e11b94a029843613d620382976431 Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.628678 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b5jn\" (UniqueName: \"kubernetes.io/projected/179db76a-61fb-4ae3-a494-be67c44c7d65-kube-api-access-7b5jn\") pod \"apiserver-7bbb656c7d-bh4dl\" (UID: \"179db76a-61fb-4ae3-a494-be67c44c7d65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.651591 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.651732 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x6b6q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.656281 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99bk8\" (UniqueName: \"kubernetes.io/projected/44dea519-de7c-4bc5-934e-5c251177b6fd-kube-api-access-99bk8\") pod \"migrator-59844c95c7-l9s9x\" (UID: \"44dea519-de7c-4bc5-934e-5c251177b6fd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.673901 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.680091 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7rjsg"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.686979 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mbw\" (UniqueName: \"kubernetes.io/projected/a58bcc67-9370-45bf-a2b4-96dec3c76b3f-kube-api-access-f4mbw\") pod \"multus-admission-controller-857f4d67dd-kqbmn\" (UID: \"a58bcc67-9370-45bf-a2b4-96dec3c76b3f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687039 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-trusted-ca-bundle\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687253 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60329838-5948-4dc3-8ff1-cf2951d99197-proxy-tls\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687298 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfd7m\" (UniqueName: \"kubernetes.io/projected/4bebf51e-0577-4143-b337-f63a04d6a73d-kube-api-access-jfd7m\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687315 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhj5\" (UniqueName: \"kubernetes.io/projected/7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700-kube-api-access-blhj5\") pod \"dns-operator-744455d44c-4tpkr\" (UID: \"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700\") " pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687362 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-signing-cabundle\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687380 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b2dba4-0e5d-45df-9d96-08f13c21c06a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-68vs6\" (UID: \"f9b2dba4-0e5d-45df-9d96-08f13c21c06a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687396 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4bebf51e-0577-4143-b337-f63a04d6a73d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687431 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvzd\" (UniqueName: \"kubernetes.io/projected/db457bae-59bc-4ec6-b5dd-8699c5794f76-kube-api-access-tzvzd\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687538 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/60329838-5948-4dc3-8ff1-cf2951d99197-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687563 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/269fe7e0-633b-41d4-8a8f-cd39424229e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75ced0f-eb7d-4781-a34a-358c9e5db98a-config\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687648 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-config\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687896 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971f1003-73c5-40ca-8c3a-5479215b4e72-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687930 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693eed37-ae85-4a0b-a3e3-4e908245ac13-config\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.687969 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.688076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75ced0f-eb7d-4781-a34a-358c9e5db98a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.688130 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nwd4\" (UniqueName: \"kubernetes.io/projected/7d652ee4-8157-4667-a521-c75855e26796-kube-api-access-7nwd4\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.691283 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkn7q\" (UniqueName: \"kubernetes.io/projected/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-kube-api-access-wkn7q\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.691340 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d652ee4-8157-4667-a521-c75855e26796-trusted-ca\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.691850 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e75ced0f-eb7d-4781-a34a-358c9e5db98a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.691996 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-service-ca\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.692029 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d652ee4-8157-4667-a521-c75855e26796-metrics-tls\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.692174 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsbnl\" (UniqueName: \"kubernetes.io/projected/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-kube-api-access-xsbnl\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.692204 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.692342 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8b4426a-81fb-42cc-92ff-9f488d660820-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.692369 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l87z5\" (UniqueName: \"kubernetes.io/projected/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-kube-api-access-l87z5\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.693816 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-serving-cert\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.693866 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjnr\" (UniqueName: \"kubernetes.io/projected/510146d5-b80f-404e-9aac-8e06a20a0c44-kube-api-access-6vjnr\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.693892 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f6qb\" (UniqueName: \"kubernetes.io/projected/971f1003-73c5-40ca-8c3a-5479215b4e72-kube-api-access-7f6qb\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.693913 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700-metrics-tls\") pod \"dns-operator-744455d44c-4tpkr\" (UID: \"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694004 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-trusted-ca\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694260 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694622 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-signing-key\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694656 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-images\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694705 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-secret-volume\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694730 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693eed37-ae85-4a0b-a3e3-4e908245ac13-serving-cert\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694794 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-certificates\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694819 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971f1003-73c5-40ca-8c3a-5479215b4e72-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694893 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm89x\" (UniqueName: \"kubernetes.io/projected/ca945ba8-363c-4e60-b11a-6938e4cb9354-kube-api-access-dm89x\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.694985 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-config\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695055 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/269fe7e0-633b-41d4-8a8f-cd39424229e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695358 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckd94\" (UniqueName: \"kubernetes.io/projected/f9b2dba4-0e5d-45df-9d96-08f13c21c06a-kube-api-access-ckd94\") pod \"package-server-manager-789f6589d5-68vs6\" (UID: \"f9b2dba4-0e5d-45df-9d96-08f13c21c06a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695379 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/693eed37-ae85-4a0b-a3e3-4e908245ac13-trusted-ca\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695433 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d652ee4-8157-4667-a521-c75855e26796-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695627 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmlk\" (UniqueName: \"kubernetes.io/projected/95388006-228d-47cb-ab64-42cea04840bc-kube-api-access-kpmlk\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695665 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510146d5-b80f-404e-9aac-8e06a20a0c44-config\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695682 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a58bcc67-9370-45bf-a2b4-96dec3c76b3f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kqbmn\" (UID: \"a58bcc67-9370-45bf-a2b4-96dec3c76b3f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695700 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-bound-sa-token\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695715 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/510146d5-b80f-404e-9aac-8e06a20a0c44-serving-cert\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695732 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwzvp\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-kube-api-access-pwzvp\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695746 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/95388006-228d-47cb-ab64-42cea04840bc-metrics-tls\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695761 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d9cb\" (UniqueName: \"kubernetes.io/projected/693eed37-ae85-4a0b-a3e3-4e908245ac13-kube-api-access-6d9cb\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695805 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph49x\" (UniqueName: \"kubernetes.io/projected/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-kube-api-access-ph49x\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: E1216 14:59:24.698042 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.198025473 +0000 UTC m=+146.038204547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.698285 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-25mqz"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.695980 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701565 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4bebf51e-0577-4143-b337-f63a04d6a73d-srv-cert\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701588 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-config-volume\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701604 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8b4426a-81fb-42cc-92ff-9f488d660820-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701682 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-serving-cert\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701698 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95388006-228d-47cb-ab64-42cea04840bc-config-volume\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701713 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-oauth-config\") pod 
\"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701729 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-oauth-serving-cert\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701745 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-client-ca\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701760 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-images\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701775 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsgbg\" (UniqueName: \"kubernetes.io/projected/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-kube-api-access-lsgbg\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701790 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lq46\" (UniqueName: \"kubernetes.io/projected/60329838-5948-4dc3-8ff1-cf2951d99197-kube-api-access-2lq46\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-config\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701845 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701890 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b4426a-81fb-42cc-92ff-9f488d660820-config\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: 
\"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701908 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-tls\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701931 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.701959 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-proxy-tls\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.709203 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.762468 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.802686 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.802930 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkn7q\" (UniqueName: \"kubernetes.io/projected/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-kube-api-access-wkn7q\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.802985 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d652ee4-8157-4667-a521-c75855e26796-trusted-ca\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803054 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4723c39e-a85e-4c31-b938-6ace3fb5f700-certs\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc 
kubenswrapper[4728]: I1216 14:59:24.803082 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e75ced0f-eb7d-4781-a34a-358c9e5db98a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803105 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05c53e6d-0610-4944-a6af-1fdb84368f05-service-ca-bundle\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cea0eff-ca09-4d09-9f20-485c9ef6003b-webhook-cert\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803157 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-service-ca\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803173 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d652ee4-8157-4667-a521-c75855e26796-metrics-tls\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803189 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cea0eff-ca09-4d09-9f20-485c9ef6003b-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803214 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmpt\" (UniqueName: \"kubernetes.io/projected/36b07558-0136-4600-86f4-d20044b9910d-kube-api-access-7mmpt\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803238 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsbnl\" (UniqueName: \"kubernetes.io/projected/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-kube-api-access-xsbnl\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803256 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803271 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8b4426a-81fb-42cc-92ff-9f488d660820-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803294 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-serving-cert\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803313 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l87z5\" (UniqueName: \"kubernetes.io/projected/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-kube-api-access-l87z5\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803336 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjnr\" (UniqueName: \"kubernetes.io/projected/510146d5-b80f-404e-9aac-8e06a20a0c44-kube-api-access-6vjnr\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803352 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6qb\" (UniqueName: \"kubernetes.io/projected/971f1003-73c5-40ca-8c3a-5479215b4e72-kube-api-access-7f6qb\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803391 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700-metrics-tls\") pod \"dns-operator-744455d44c-4tpkr\" (UID: \"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700\") " pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803431 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7rt\" (UniqueName: \"kubernetes.io/projected/31f40573-11e0-49ad-adeb-d0f013b07696-kube-api-access-cm7rt\") pod \"ingress-canary-87x9r\" (UID: \"31f40573-11e0-49ad-adeb-d0f013b07696\") " pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803457 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-trusted-ca\") 
pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803473 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803489 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-socket-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803506 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-images\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803531 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-signing-key\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803547 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-secret-volume\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803561 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-stats-auth\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803577 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693eed37-ae85-4a0b-a3e3-4e908245ac13-serving-cert\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803599 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-certificates\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803626 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971f1003-73c5-40ca-8c3a-5479215b4e72-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803658 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-metrics-certs\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803674 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm89x\" (UniqueName: \"kubernetes.io/projected/ca945ba8-363c-4e60-b11a-6938e4cb9354-kube-api-access-dm89x\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803697 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-config\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803712 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/269fe7e0-633b-41d4-8a8f-cd39424229e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803727 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmlk\" (UniqueName: \"kubernetes.io/projected/95388006-228d-47cb-ab64-42cea04840bc-kube-api-access-kpmlk\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803744 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510146d5-b80f-404e-9aac-8e06a20a0c44-config\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803759 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckd94\" (UniqueName: \"kubernetes.io/projected/f9b2dba4-0e5d-45df-9d96-08f13c21c06a-kube-api-access-ckd94\") pod \"package-server-manager-789f6589d5-68vs6\" (UID: \"f9b2dba4-0e5d-45df-9d96-08f13c21c06a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803776 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/693eed37-ae85-4a0b-a3e3-4e908245ac13-trusted-ca\") 
pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803790 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d652ee4-8157-4667-a521-c75855e26796-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803806 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-bound-sa-token\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803820 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/510146d5-b80f-404e-9aac-8e06a20a0c44-serving-cert\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803834 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a58bcc67-9370-45bf-a2b4-96dec3c76b3f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kqbmn\" (UID: \"a58bcc67-9370-45bf-a2b4-96dec3c76b3f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803849 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-registration-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803873 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwzvp\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-kube-api-access-pwzvp\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803889 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph49x\" (UniqueName: \"kubernetes.io/projected/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-kube-api-access-ph49x\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803903 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/95388006-228d-47cb-ab64-42cea04840bc-metrics-tls\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803919 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6d9cb\" (UniqueName: \"kubernetes.io/projected/693eed37-ae85-4a0b-a3e3-4e908245ac13-kube-api-access-6d9cb\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31f40573-11e0-49ad-adeb-d0f013b07696-cert\") pod \"ingress-canary-87x9r\" (UID: \"31f40573-11e0-49ad-adeb-d0f013b07696\") " pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803955 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4bebf51e-0577-4143-b337-f63a04d6a73d-srv-cert\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803971 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-config-volume\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.803990 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8b4426a-81fb-42cc-92ff-9f488d660820-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804008 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvw9\" (UniqueName: \"kubernetes.io/projected/14a40ff4-9558-428f-a784-c18c5d62d60a-kube-api-access-wvvw9\") pod \"control-plane-machine-set-operator-78cbb6b69f-zf5xv\" (UID: \"14a40ff4-9558-428f-a784-c18c5d62d60a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804031 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-serving-cert\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804053 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95388006-228d-47cb-ab64-42cea04840bc-config-volume\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804067 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-oauth-config\") pod 
\"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804089 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-oauth-serving-cert\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804105 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-client-ca\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804119 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-images\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804133 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsgbg\" (UniqueName: \"kubernetes.io/projected/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-kube-api-access-lsgbg\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804190 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lq46\" (UniqueName: \"kubernetes.io/projected/60329838-5948-4dc3-8ff1-cf2951d99197-kube-api-access-2lq46\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804211 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s568\" (UniqueName: \"kubernetes.io/projected/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-kube-api-access-7s568\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804237 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-config\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804284 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: 
I1216 14:59:24.804303 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-default-certificate\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804325 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-tls\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804366 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b4426a-81fb-42cc-92ff-9f488d660820-config\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804390 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4723c39e-a85e-4c31-b938-6ace3fb5f700-node-bootstrap-token\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804454 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804474 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-proxy-tls\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804489 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3cea0eff-ca09-4d09-9f20-485c9ef6003b-tmpfs\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804546 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mbw\" (UniqueName: \"kubernetes.io/projected/a58bcc67-9370-45bf-a2b4-96dec3c76b3f-kube-api-access-f4mbw\") pod \"multus-admission-controller-857f4d67dd-kqbmn\" (UID: \"a58bcc67-9370-45bf-a2b4-96dec3c76b3f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804565 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-trusted-ca-bundle\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804603 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60329838-5948-4dc3-8ff1-cf2951d99197-proxy-tls\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804620 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxskk\" (UniqueName: \"kubernetes.io/projected/3cea0eff-ca09-4d09-9f20-485c9ef6003b-kube-api-access-qxskk\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804636 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kmxj\" (UniqueName: \"kubernetes.io/projected/4723c39e-a85e-4c31-b938-6ace3fb5f700-kube-api-access-7kmxj\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804650 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-plugins-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804688 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfd7m\" (UniqueName: \"kubernetes.io/projected/4bebf51e-0577-4143-b337-f63a04d6a73d-kube-api-access-jfd7m\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804706 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-signing-cabundle\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804726 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b2dba4-0e5d-45df-9d96-08f13c21c06a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-68vs6\" (UID: \"f9b2dba4-0e5d-45df-9d96-08f13c21c06a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804747 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhj5\" (UniqueName: 
\"kubernetes.io/projected/7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700-kube-api-access-blhj5\") pod \"dns-operator-744455d44c-4tpkr\" (UID: \"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700\") " pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804782 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804812 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4bebf51e-0577-4143-b337-f63a04d6a73d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804830 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzvzd\" (UniqueName: \"kubernetes.io/projected/db457bae-59bc-4ec6-b5dd-8699c5794f76-kube-api-access-tzvzd\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804866 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/60329838-5948-4dc3-8ff1-cf2951d99197-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804883 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/269fe7e0-633b-41d4-8a8f-cd39424229e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804898 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75ced0f-eb7d-4781-a34a-358c9e5db98a-config\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-config\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804960 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a40ff4-9558-428f-a784-c18c5d62d60a-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-zf5xv\" (UID: \"14a40ff4-9558-428f-a784-c18c5d62d60a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804976 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-mountpoint-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.804993 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-csi-data-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805043 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971f1003-73c5-40ca-8c3a-5479215b4e72-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805058 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693eed37-ae85-4a0b-a3e3-4e908245ac13-config\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805073 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-srv-cert\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805114 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805135 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75ced0f-eb7d-4781-a34a-358c9e5db98a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805165 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nwd4\" (UniqueName: \"kubernetes.io/projected/7d652ee4-8157-4667-a521-c75855e26796-kube-api-access-7nwd4\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805214 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhqw6\" (UniqueName: \"kubernetes.io/projected/05c53e6d-0610-4944-a6af-1fdb84368f05-kube-api-access-xhqw6\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.805913 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510146d5-b80f-404e-9aac-8e06a20a0c44-config\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: E1216 14:59:24.806031 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.306010409 +0000 UTC m=+146.146189473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.806544 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/693eed37-ae85-4a0b-a3e3-4e908245ac13-trusted-ca\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.809159 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-config\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.811873 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-client-ca\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.812319 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b4426a-81fb-42cc-92ff-9f488d660820-config\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.813324 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-service-ca\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.813495 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700-metrics-tls\") pod \"dns-operator-744455d44c-4tpkr\" (UID: \"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700\") " pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.813862 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.814966 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-certificates\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.817402 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a58bcc67-9370-45bf-a2b4-96dec3c76b3f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kqbmn\" (UID: \"a58bcc67-9370-45bf-a2b4-96dec3c76b3f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.818615 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.818624 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-config\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.818933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-oauth-config\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.819222 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8b4426a-81fb-42cc-92ff-9f488d660820-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.819233 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-oauth-serving-cert\") pod 
\"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.819393 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693eed37-ae85-4a0b-a3e3-4e908245ac13-serving-cert\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.819685 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/971f1003-73c5-40ca-8c3a-5479215b4e72-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.820189 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.820204 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-signing-key\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.821999 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/269fe7e0-633b-41d4-8a8f-cd39424229e4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.822073 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-trusted-ca-bundle\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.822468 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4bebf51e-0577-4143-b337-f63a04d6a73d-srv-cert\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.822933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-proxy-tls\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.823264 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/7d652ee4-8157-4667-a521-c75855e26796-trusted-ca\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.823716 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-signing-cabundle\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.824521 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95388006-228d-47cb-ab64-42cea04840bc-config-volume\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.825000 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-config-volume\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.825161 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60329838-5948-4dc3-8ff1-cf2951d99197-proxy-tls\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.825502 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-images\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.825673 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/60329838-5948-4dc3-8ff1-cf2951d99197-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.826228 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d652ee4-8157-4667-a521-c75855e26796-metrics-tls\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.826280 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc 
kubenswrapper[4728]: I1216 14:59:24.826988 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693eed37-ae85-4a0b-a3e3-4e908245ac13-config\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.826989 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e75ced0f-eb7d-4781-a34a-358c9e5db98a-config\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.827055 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971f1003-73c5-40ca-8c3a-5479215b4e72-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.827342 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-config\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.827631 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/269fe7e0-633b-41d4-8a8f-cd39424229e4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.827826 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-images\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.827957 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/510146d5-b80f-404e-9aac-8e06a20a0c44-serving-cert\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.827960 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-tls\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.828476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4bebf51e-0577-4143-b337-f63a04d6a73d-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.828993 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b2dba4-0e5d-45df-9d96-08f13c21c06a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-68vs6\" (UID: \"f9b2dba4-0e5d-45df-9d96-08f13c21c06a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.828997 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-serving-cert\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.829282 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-trusted-ca\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.829755 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-serving-cert\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.830517 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e75ced0f-eb7d-4781-a34a-358c9e5db98a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.830910 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.834273 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.834534 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-secret-volume\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: 
I1216 14:59:24.835834 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/95388006-228d-47cb-ab64-42cea04840bc-metrics-tls\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.846321 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckd94\" (UniqueName: \"kubernetes.io/projected/f9b2dba4-0e5d-45df-9d96-08f13c21c06a-kube-api-access-ckd94\") pod \"package-server-manager-789f6589d5-68vs6\" (UID: \"f9b2dba4-0e5d-45df-9d96-08f13c21c06a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.872202 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l87z5\" (UniqueName: \"kubernetes.io/projected/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-kube-api-access-l87z5\") pod \"collect-profiles-29431605-hwwmj\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.887886 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.896741 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkn7q\" (UniqueName: \"kubernetes.io/projected/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-kube-api-access-wkn7q\") pod \"controller-manager-879f6c89f-hnrqv\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906095 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4723c39e-a85e-4c31-b938-6ace3fb5f700-certs\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906136 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cea0eff-ca09-4d09-9f20-485c9ef6003b-webhook-cert\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906155 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05c53e6d-0610-4944-a6af-1fdb84368f05-service-ca-bundle\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906171 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cea0eff-ca09-4d09-9f20-485c9ef6003b-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906187 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7mmpt\" (UniqueName: \"kubernetes.io/projected/36b07558-0136-4600-86f4-d20044b9910d-kube-api-access-7mmpt\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906224 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7rt\" (UniqueName: \"kubernetes.io/projected/31f40573-11e0-49ad-adeb-d0f013b07696-kube-api-access-cm7rt\") pod \"ingress-canary-87x9r\" (UID: \"31f40573-11e0-49ad-adeb-d0f013b07696\") " pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-socket-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906266 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-stats-auth\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906289 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-metrics-certs\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906354 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-registration-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906395 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31f40573-11e0-49ad-adeb-d0f013b07696-cert\") pod \"ingress-canary-87x9r\" (UID: \"31f40573-11e0-49ad-adeb-d0f013b07696\") " pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906425 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906634 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvw9\" (UniqueName: \"kubernetes.io/projected/14a40ff4-9558-428f-a784-c18c5d62d60a-kube-api-access-wvvw9\") pod \"control-plane-machine-set-operator-78cbb6b69f-zf5xv\" (UID: \"14a40ff4-9558-428f-a784-c18c5d62d60a\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906687 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s568\" (UniqueName: \"kubernetes.io/projected/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-kube-api-access-7s568\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906713 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-default-certificate\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906833 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4723c39e-a85e-4c31-b938-6ace3fb5f700-node-bootstrap-token\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906859 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3cea0eff-ca09-4d09-9f20-485c9ef6003b-tmpfs\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906896 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kmxj\" (UniqueName: \"kubernetes.io/projected/4723c39e-a85e-4c31-b938-6ace3fb5f700-kube-api-access-7kmxj\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.906996 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-plugins-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907014 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxskk\" (UniqueName: \"kubernetes.io/projected/3cea0eff-ca09-4d09-9f20-485c9ef6003b-kube-api-access-qxskk\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907038 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907147 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a40ff4-9558-428f-a784-c18c5d62d60a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zf5xv\" (UID: \"14a40ff4-9558-428f-a784-c18c5d62d60a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907171 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-mountpoint-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907187 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-csi-data-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907202 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-srv-cert\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.907227 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhqw6\" (UniqueName: \"kubernetes.io/projected/05c53e6d-0610-4944-a6af-1fdb84368f05-kube-api-access-xhqw6\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.911101 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-plugins-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.911404 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3cea0eff-ca09-4d09-9f20-485c9ef6003b-tmpfs\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.912650 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-csi-data-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.912686 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-mountpoint-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.912992 
4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-socket-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.913103 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/36b07558-0136-4600-86f4-d20044b9910d-registration-dir\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.913701 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05c53e6d-0610-4944-a6af-1fdb84368f05-service-ca-bundle\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.913951 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4723c39e-a85e-4c31-b938-6ace3fb5f700-certs\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: E1216 14:59:24.914712 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.414698012 +0000 UTC m=+146.254876996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.918621 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.919882 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cea0eff-ca09-4d09-9f20-485c9ef6003b-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.920091 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-default-certificate\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.924957 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.926019 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-srv-cert\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.932122 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-metrics-certs\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.932274 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cea0eff-ca09-4d09-9f20-485c9ef6003b-webhook-cert\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.932903 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31f40573-11e0-49ad-adeb-d0f013b07696-cert\") pod \"ingress-canary-87x9r\" (UID: \"31f40573-11e0-49ad-adeb-d0f013b07696\") " pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.932942 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.933930 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x6b6q"] Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.934552 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d652ee4-8157-4667-a521-c75855e26796-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.944188 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4723c39e-a85e-4c31-b938-6ace3fb5f700-node-bootstrap-token\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.948925 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.949010 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/05c53e6d-0610-4944-a6af-1fdb84368f05-stats-auth\") pod \"router-default-5444994796-55pkk\" (UID: 
\"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.949835 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-bound-sa-token\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.950169 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/14a40ff4-9558-428f-a784-c18c5d62d60a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zf5xv\" (UID: \"14a40ff4-9558-428f-a784-c18c5d62d60a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.950834 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjnr\" (UniqueName: \"kubernetes.io/projected/510146d5-b80f-404e-9aac-8e06a20a0c44-kube-api-access-6vjnr\") pod \"service-ca-operator-777779d784-g66rv\" (UID: \"510146d5-b80f-404e-9aac-8e06a20a0c44\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.964846 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6qb\" (UniqueName: \"kubernetes.io/projected/971f1003-73c5-40ca-8c3a-5479215b4e72-kube-api-access-7f6qb\") pod \"kube-storage-version-migrator-operator-b67b599dd-8cvcc\" (UID: \"971f1003-73c5-40ca-8c3a-5479215b4e72\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:24 crc kubenswrapper[4728]: W1216 14:59:24.976372 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc809652e_9ba8_4499_b2c5_4a0015048ea0.slice/crio-90e4dbbf7494ffe4cc41d9bdb8aaf4588a017016626da51a808a9de5294952eb WatchSource:0}: Error finding container 90e4dbbf7494ffe4cc41d9bdb8aaf4588a017016626da51a808a9de5294952eb: Status 404 returned error can't find the container with id 90e4dbbf7494ffe4cc41d9bdb8aaf4588a017016626da51a808a9de5294952eb Dec 16 14:59:24 crc kubenswrapper[4728]: W1216 14:59:24.977802 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2cfce66_2c6c_41ad_84f6_6726c3ea088d.slice/crio-42ab6db6d2aefdcb97960ca0b1d658d85bd032cfca78a72ff61053c5d7bdb4a9 WatchSource:0}: Error finding container 42ab6db6d2aefdcb97960ca0b1d658d85bd032cfca78a72ff61053c5d7bdb4a9: Status 404 returned error can't find the container with id 42ab6db6d2aefdcb97960ca0b1d658d85bd032cfca78a72ff61053c5d7bdb4a9 Dec 16 14:59:24 crc kubenswrapper[4728]: W1216 14:59:24.979774 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a82fd60_88d6_4e1a_b140_bf2f92ea0a87.slice/crio-64e1df958d5f894cd2cec30674cdf196b37ba9fa13492a5c1148912de02452a9 WatchSource:0}: Error finding container 64e1df958d5f894cd2cec30674cdf196b37ba9fa13492a5c1148912de02452a9: Status 404 returned error can't find the container with id 64e1df958d5f894cd2cec30674cdf196b37ba9fa13492a5c1148912de02452a9 Dec 16 14:59:24 crc 
kubenswrapper[4728]: I1216 14:59:24.991279 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsgbg\" (UniqueName: \"kubernetes.io/projected/4dc26d87-3ce0-4071-bf63-07cbe9e19d6c-kube-api-access-lsgbg\") pod \"machine-config-operator-74547568cd-9mv8j\" (UID: \"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:24 crc kubenswrapper[4728]: I1216 14:59:24.995756 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.002615 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.007698 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwzvp\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-kube-api-access-pwzvp\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.007940 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.008109 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.5080835 +0000 UTC m=+146.348262484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.008353 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.008719 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.508707355 +0000 UTC m=+146.348886339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.017469 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.028606 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lq46\" (UniqueName: \"kubernetes.io/projected/60329838-5948-4dc3-8ff1-cf2951d99197-kube-api-access-2lq46\") pod \"machine-config-controller-84d6567774-rkm4q\" (UID: \"60329838-5948-4dc3-8ff1-cf2951d99197\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.040956 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.054255 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph49x\" (UniqueName: \"kubernetes.io/projected/a5f01333-4573-4a05-b1cf-5dfdc95a33cd-kube-api-access-ph49x\") pod \"service-ca-9c57cc56f-hw5mk\" (UID: \"a5f01333-4573-4a05-b1cf-5dfdc95a33cd\") " pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.057506 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.068803 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e75ced0f-eb7d-4781-a34a-358c9e5db98a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nt6nd\" (UID: \"e75ced0f-eb7d-4781-a34a-358c9e5db98a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.076242 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.083138 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.111064 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.111640 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:59:25.611624464 +0000 UTC m=+146.451803448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.127128 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d9cb\" (UniqueName: \"kubernetes.io/projected/693eed37-ae85-4a0b-a3e3-4e908245ac13-kube-api-access-6d9cb\") pod \"console-operator-58897d9998-phlkm\" (UID: \"693eed37-ae85-4a0b-a3e3-4e908245ac13\") " pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.131465 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsbnl\" (UniqueName: \"kubernetes.io/projected/6ef09dcb-9a41-4fb0-8492-cdd81b0222fe-kube-api-access-xsbnl\") pod \"machine-api-operator-5694c8668f-pfz7w\" (UID: \"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:25 crc kubenswrapper[4728]: W1216 14:59:25.148492 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44dea519_de7c_4bc5_934e_5c251177b6fd.slice/crio-eb07189bdb8fe33bcf76d10620207398e481d7ebea1d990985c40b54a7a4a31d WatchSource:0}: Error finding container eb07189bdb8fe33bcf76d10620207398e481d7ebea1d990985c40b54a7a4a31d: Status 404 returned error can't find the container with id eb07189bdb8fe33bcf76d10620207398e481d7ebea1d990985c40b54a7a4a31d Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.150813 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm89x\" (UniqueName: \"kubernetes.io/projected/ca945ba8-363c-4e60-b11a-6938e4cb9354-kube-api-access-dm89x\") pod \"marketplace-operator-79b997595-8k6z5\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.167610 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mbw\" (UniqueName: \"kubernetes.io/projected/a58bcc67-9370-45bf-a2b4-96dec3c76b3f-kube-api-access-f4mbw\") pod \"multus-admission-controller-857f4d67dd-kqbmn\" (UID: \"a58bcc67-9370-45bf-a2b4-96dec3c76b3f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.191721 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfd7m\" (UniqueName: \"kubernetes.io/projected/4bebf51e-0577-4143-b337-f63a04d6a73d-kube-api-access-jfd7m\") pod \"catalog-operator-68c6474976-qnflb\" (UID: \"4bebf51e-0577-4143-b337-f63a04d6a73d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.198732 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.208922 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-blhj5\" (UniqueName: \"kubernetes.io/projected/7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700-kube-api-access-blhj5\") pod \"dns-operator-744455d44c-4tpkr\" (UID: \"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700\") " pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.212399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.212748 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.712735916 +0000 UTC m=+146.552914900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.228058 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8b4426a-81fb-42cc-92ff-9f488d660820-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-drpzn\" (UID: \"f8b4426a-81fb-42cc-92ff-9f488d660820\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.232246 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.245193 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hnrqv"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.248303 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvzd\" (UniqueName: \"kubernetes.io/projected/db457bae-59bc-4ec6-b5dd-8699c5794f76-kube-api-access-tzvzd\") pod \"console-f9d7485db-pr5wl\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.263260 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.266290 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.274101 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nwd4\" (UniqueName: \"kubernetes.io/projected/7d652ee4-8157-4667-a521-c75855e26796-kube-api-access-7nwd4\") pod \"ingress-operator-5b745b69d9-7r8ts\" (UID: \"7d652ee4-8157-4667-a521-c75855e26796\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.280518 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.285804 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.292447 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmlk\" (UniqueName: \"kubernetes.io/projected/95388006-228d-47cb-ab64-42cea04840bc-kube-api-access-kpmlk\") pod \"dns-default-m9d45\" (UID: \"95388006-228d-47cb-ab64-42cea04840bc\") " pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.304095 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.308816 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhqw6\" (UniqueName: \"kubernetes.io/projected/05c53e6d-0610-4944-a6af-1fdb84368f05-kube-api-access-xhqw6\") pod \"router-default-5444994796-55pkk\" (UID: \"05c53e6d-0610-4944-a6af-1fdb84368f05\") " pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.314257 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.316730 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.816712651 +0000 UTC m=+146.656891635 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: W1216 14:59:25.318637 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe87da2c_8dce_4856_9fff_fdb78f7c4dcf.slice/crio-52040c599a95bb3a936989fab87ff71c5419f13ddb3d02f5806cc8fde247aba2 WatchSource:0}: Error finding container 52040c599a95bb3a936989fab87ff71c5419f13ddb3d02f5806cc8fde247aba2: Status 404 returned error can't find the container with id 52040c599a95bb3a936989fab87ff71c5419f13ddb3d02f5806cc8fde247aba2 Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.318824 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.319291 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.819282535 +0000 UTC m=+146.659461519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.324760 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.331883 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvvw9\" (UniqueName: \"kubernetes.io/projected/14a40ff4-9558-428f-a784-c18c5d62d60a-kube-api-access-wvvw9\") pod \"control-plane-machine-set-operator-78cbb6b69f-zf5xv\" (UID: \"14a40ff4-9558-428f-a784-c18c5d62d60a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.332096 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.347946 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.357754 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.357918 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kmxj\" (UniqueName: \"kubernetes.io/projected/4723c39e-a85e-4c31-b938-6ace3fb5f700-kube-api-access-7kmxj\") pod \"machine-config-server-mf9mx\" (UID: \"4723c39e-a85e-4c31-b938-6ace3fb5f700\") " pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.363956 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ggsk6"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.368254 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.371060 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s568\" (UniqueName: \"kubernetes.io/projected/b3bb3df3-c0ee-450a-827f-ff91fe54d0db-kube-api-access-7s568\") pod \"olm-operator-6b444d44fb-n7d4h\" (UID: \"b3bb3df3-c0ee-450a-827f-ff91fe54d0db\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.380241 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.391925 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxskk\" (UniqueName: \"kubernetes.io/projected/3cea0eff-ca09-4d09-9f20-485c9ef6003b-kube-api-access-qxskk\") pod \"packageserver-d55dfcdfc-z6qqt\" (UID: \"3cea0eff-ca09-4d09-9f20-485c9ef6003b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.399927 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.406119 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.407866 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmpt\" (UniqueName: \"kubernetes.io/projected/36b07558-0136-4600-86f4-d20044b9910d-kube-api-access-7mmpt\") pod \"csi-hostpathplugin-5t99c\" (UID: \"36b07558-0136-4600-86f4-d20044b9910d\") " pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.413392 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.420098 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.420468 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:25.920450309 +0000 UTC m=+146.760629293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.421078 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.429328 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.432340 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" event={"ID":"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7","Type":"ContainerStarted","Data":"65acdf1b665989342cb30ee6a2c069602a47a2bbb8fabdfe95f9846ae12899a9"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.432987 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7rt\" (UniqueName: \"kubernetes.io/projected/31f40573-11e0-49ad-adeb-d0f013b07696-kube-api-access-cm7rt\") pod \"ingress-canary-87x9r\" (UID: \"31f40573-11e0-49ad-adeb-d0f013b07696\") " pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.451813 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.459569 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-87x9r" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.460713 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" event={"ID":"a752efc5-d365-4774-a134-f2199e58d26e","Type":"ContainerStarted","Data":"8ade47454241161e3322d901b73af8adc6695a7dc6795102dcdbc0007bb56125"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.460756 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" event={"ID":"a752efc5-d365-4774-a134-f2199e58d26e","Type":"ContainerStarted","Data":"7918f7d24f62b1b2fcabf1181e37c7eeecbbb7ae47e6f72875ccdbf5d125769f"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.461857 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" event={"ID":"f2cfce66-2c6c-41ad-84f6-6726c3ea088d","Type":"ContainerStarted","Data":"42ab6db6d2aefdcb97960ca0b1d658d85bd032cfca78a72ff61053c5d7bdb4a9"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.464550 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mf9mx" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.464982 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x6b6q" event={"ID":"7a82fd60-88d6-4e1a-b140-bf2f92ea0a87","Type":"ContainerStarted","Data":"64e1df958d5f894cd2cec30674cdf196b37ba9fa13492a5c1148912de02452a9"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.474320 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.481341 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" event={"ID":"eeff725e-9dab-4bec-99f6-8105af9b3b6c","Type":"ContainerStarted","Data":"21dcef68631d91aba594e87a446fa92562917775907ea91526f24b0d87c170ec"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.481380 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" event={"ID":"eeff725e-9dab-4bec-99f6-8105af9b3b6c","Type":"ContainerStarted","Data":"7a130af95c875f838ba20efd5b85bc31edf44d9f3c3be4e2ad2cd49be0bf6996"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.482316 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.494505 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" event={"ID":"4d1c9711-c6ae-4b4a-bafb-07891dd80514","Type":"ContainerStarted","Data":"5de4ffb63f8fef43016a5b2cdfea90a1669aa42f877c787617fff78a1b644d43"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.521392 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 
crc kubenswrapper[4728]: E1216 14:59:25.524974 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.024961267 +0000 UTC m=+146.865140251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: W1216 14:59:25.532244 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee42482_d554_4944_8fce_e503c10c0ec9.slice/crio-88bd678c64e2e6bc4693e85ef9609db43817ca1c0f9dc64ac5c238844a1797c1 WatchSource:0}: Error finding container 88bd678c64e2e6bc4693e85ef9609db43817ca1c0f9dc64ac5c238844a1797c1: Status 404 returned error can't find the container with id 88bd678c64e2e6bc4693e85ef9609db43817ca1c0f9dc64ac5c238844a1797c1 Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.536653 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.537832 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" event={"ID":"2555bf9d-eb8f-4c07-b78a-8aec21d16b75","Type":"ContainerStarted","Data":"882c237e90ea1facfabc7e0d0e3a15400c3cd5ba9445b0b94e0df82d3e1a903a"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.537866 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.537879 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" event={"ID":"2555bf9d-eb8f-4c07-b78a-8aec21d16b75","Type":"ContainerStarted","Data":"af3907fb59a249bceefca1f6bb2d89bbd71e11b94a029843613d620382976431"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.538238 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.543092 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" event={"ID":"179db76a-61fb-4ae3-a494-be67c44c7d65","Type":"ContainerStarted","Data":"67190be539efc7e678e0d92c900222d60762aef52757c73e35dbded63fb1c7a4"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.560176 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" event={"ID":"c809652e-9ba8-4499-b2c5-4a0015048ea0","Type":"ContainerStarted","Data":"90e4dbbf7494ffe4cc41d9bdb8aaf4588a017016626da51a808a9de5294952eb"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.563344 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" 
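
Annotation on the entries above: kubelet is repeatedly failing MountVolume.MountDevice for the image-registry PVC (pod image-registry-697d97f7c8-m9gpr) and UnmountVolume.TearDown for the same volume (pod UID 8f668bae-612b-4b75-9490-919e737c6a3b), each time with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", and nestedpendingoperations reschedules each attempt roughly 500ms out (durationBeforeRetry 500ms). This pattern is consistent with a transient window after node startup in which the hostpath CSI node plugin has not yet re-registered with kubelet; the same stream shows the csi-hostpathplugin-5t99c pod (socket-dir and registration-dir mounts) still being brought up, after which registration would normally succeed and the retries stop. Below is a minimal sketch, not part of the log, for summarizing this retry churn from a saved copy of the journal; the file name kubelet.log and the regex are illustrative assumptions, not anything the log itself specifies:

```go
// csi_retry_scan.go: a minimal sketch for counting the volume-operation
// failures visible in this journal. Assumptions (not from the log itself):
// the dump is saved as kubelet.log, and failures appear in the
// nestedpendingoperations form shown above, e.g.
//   Error: MountVolume.MountDevice failed for volume "pvc-..." (...)
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("kubelet.log") // hypothetical file name
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	// Capture the failed operation (MountVolume.MountDevice,
	// UnmountVolume.TearDown, ...) and the volume it targeted.
	re := regexp.MustCompile(`Error: (\w+\.\w+) failed for volume "([^"]+)"`)

	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	// Journal dumps can carry several entries per physical line; allow long lines.
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		// FindAll, since one wrapped line may hold multiple failure entries.
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]+" "+m[2]]++
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for k, n := range counts {
		fmt.Printf("%4d  %s\n", n, k)
	}
}
```

Run with `go run csi_retry_scan.go`; over this section it would report the MountVolume.MountDevice and UnmountVolume.TearDown attempt counts for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8", which is usually enough to tell a short registration race (a burst that stops once the plugin registers) from a persistently unregistered driver.
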
Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.577225 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g66rv"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.591020 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" event={"ID":"06463dfe-cbd3-45df-847a-732f66305f9a","Type":"ContainerStarted","Data":"768c75558d9afd002a9355d8b2df1c4628848d18af245522f88d080d51fc4086"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.591091 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" event={"ID":"06463dfe-cbd3-45df-847a-732f66305f9a","Type":"ContainerStarted","Data":"1c8313f751087106dc4d6efbd9091f7175e991b569e814b5626e4f5b85a9e8bc"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.598000 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" event={"ID":"83befb95-6263-4eb5-85f9-7d061e73a0f4","Type":"ContainerStarted","Data":"cbb442d608bd8b62753f7298cd33b582d1e76ef3fd0d365867e4773f7215d1a6"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.598046 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" event={"ID":"83befb95-6263-4eb5-85f9-7d061e73a0f4","Type":"ContainerStarted","Data":"b23822adf8118f086b351b98a648862c8562c9abd4e82db9408c6cbdca849316"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.602472 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" event={"ID":"44dea519-de7c-4bc5-934e-5c251177b6fd","Type":"ContainerStarted","Data":"eb07189bdb8fe33bcf76d10620207398e481d7ebea1d990985c40b54a7a4a31d"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.604581 4728 generic.go:334] "Generic (PLEG): container finished" podID="b7a24aa5-4c07-4f1f-984a-c3e4a73231e4" containerID="9a0f576691a3da163910def5ca7784aac6296560479b70335fc6c65bb48dd8ee" exitCode=0 Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.604626 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" event={"ID":"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4","Type":"ContainerDied","Data":"9a0f576691a3da163910def5ca7784aac6296560479b70335fc6c65bb48dd8ee"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.604646 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" event={"ID":"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4","Type":"ContainerStarted","Data":"05c06b5996cf3006480d18d2162418b2f2b2a37d6ef42d42595d005960b4c05a"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.606287 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" event={"ID":"2344bb62-ca23-4655-80b7-04fd2f766b9f","Type":"ContainerStarted","Data":"df1dc05bc7a66e5dceaf3a25432a6f7760dfaca1a18d3d061ebfbe8343dd2cd8"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.606311 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" 
event={"ID":"2344bb62-ca23-4655-80b7-04fd2f766b9f","Type":"ContainerStarted","Data":"63845ac1a414cff2c654d387622557dd654b8f18bfa6b2691854a0326dbf2c30"} Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.622589 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.622850 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.122827858 +0000 UTC m=+146.963006842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.623144 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.623947 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.123932846 +0000 UTC m=+146.964111830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.678177 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q"] Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.724140 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.724576 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:59:26.224546096 +0000 UTC m=+147.064725080 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.826183 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.826543 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.326531061 +0000 UTC m=+147.166710045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.833188 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 14:59:25 crc kubenswrapper[4728]: I1216 14:59:25.931271 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:25 crc kubenswrapper[4728]: E1216 14:59:25.931757 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.431742037 +0000 UTC m=+147.271921021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:25 crc kubenswrapper[4728]: W1216 14:59:25.982360 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05c53e6d_0610_4944_a6af_1fdb84368f05.slice/crio-7cc1d42162612b16c37a4beb9d249a4c22d6c902320ba645b580340ef7ead88c WatchSource:0}: Error finding container 7cc1d42162612b16c37a4beb9d249a4c22d6c902320ba645b580340ef7ead88c: Status 404 returned error can't find the container with id 7cc1d42162612b16c37a4beb9d249a4c22d6c902320ba645b580340ef7ead88c Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.033636 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.034067 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.534051459 +0000 UTC m=+147.374230443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: W1216 14:59:26.092322 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4723c39e_a85e_4c31_b938_6ace3fb5f700.slice/crio-be2bc75f48c0d22f0d5f9d70663093816eb0ddfb3f19c412d7e526296072a089 WatchSource:0}: Error finding container be2bc75f48c0d22f0d5f9d70663093816eb0ddfb3f19c412d7e526296072a089: Status 404 returned error can't find the container with id be2bc75f48c0d22f0d5f9d70663093816eb0ddfb3f19c412d7e526296072a089 Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.135917 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.136201 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:59:26.636186178 +0000 UTC m=+147.476365152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.239523 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.239862 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.739837674 +0000 UTC m=+147.580016648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.266625 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" podStartSLOduration=123.266605247 podStartE2EDuration="2m3.266605247s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:26.26590869 +0000 UTC m=+147.106087674" watchObservedRunningTime="2025-12-16 14:59:26.266605247 +0000 UTC m=+147.106784231" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.340772 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.340962 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.840928566 +0000 UTC m=+147.681107550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.341126 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.341479 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.841463469 +0000 UTC m=+147.681642453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.442122 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.442445 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:26.942413418 +0000 UTC m=+147.782592402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.537570 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ldfwg" podStartSLOduration=123.537551641 podStartE2EDuration="2m3.537551641s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:26.503989777 +0000 UTC m=+147.344168751" watchObservedRunningTime="2025-12-16 14:59:26.537551641 +0000 UTC m=+147.377730625" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.543472 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.543532 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.543561 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.543583 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.543605 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.544381 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-16 14:59:27.044357881 +0000 UTC m=+147.884536865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.544841 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.554016 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.554824 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.556003 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.632824 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.648938 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.649316 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.149300541 +0000 UTC m=+147.989479525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.654114 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" event={"ID":"5ee42482-d554-4944-8fce-e503c10c0ec9","Type":"ContainerStarted","Data":"88bd678c64e2e6bc4693e85ef9609db43817ca1c0f9dc64ac5c238844a1797c1"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.716449 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" event={"ID":"a752efc5-d365-4774-a134-f2199e58d26e","Type":"ContainerStarted","Data":"fb3c938e99050ec407a7fcb9e818238c498ca2dec0fe62ccbf1edfcc56b34678"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.736114 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.743192 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" event={"ID":"510146d5-b80f-404e-9aac-8e06a20a0c44","Type":"ContainerStarted","Data":"2d1eb585a347490102dfd9abd783c7003cb45dc5ffd39024922af3f9d28a514d"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.750648 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" podStartSLOduration=123.750632209 podStartE2EDuration="2m3.750632209s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:26.74870008 +0000 UTC m=+147.588879064" watchObservedRunningTime="2025-12-16 14:59:26.750632209 +0000 UTC m=+147.590811193" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.750913 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.752698 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.252684881 +0000 UTC m=+148.092863865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.769155 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.799460 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-55pkk" event={"ID":"05c53e6d-0610-4944-a6af-1fdb84368f05","Type":"ContainerStarted","Data":"7cc1d42162612b16c37a4beb9d249a4c22d6c902320ba645b580340ef7ead88c"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.809659 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mf9mx" event={"ID":"4723c39e-a85e-4c31-b938-6ace3fb5f700","Type":"ContainerStarted","Data":"be2bc75f48c0d22f0d5f9d70663093816eb0ddfb3f19c412d7e526296072a089"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.825651 4728 generic.go:334] "Generic (PLEG): container finished" podID="6f4e29bc-31e9-421e-abdf-c692d1dfb0c7" containerID="c77d04de547a04672ee44f3833ae0c7ad9f567c1bae57af4947dd66993907658" exitCode=0 Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.825703 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" event={"ID":"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7","Type":"ContainerDied","Data":"c77d04de547a04672ee44f3833ae0c7ad9f567c1bae57af4947dd66993907658"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.840232 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" event={"ID":"f9b2dba4-0e5d-45df-9d96-08f13c21c06a","Type":"ContainerStarted","Data":"f7fc5ed57379843f9dd5099e7bd64040b4c799fadeb7d24253ca1b1ef9038477"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.840270 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" event={"ID":"f9b2dba4-0e5d-45df-9d96-08f13c21c06a","Type":"ContainerStarted","Data":"b5c6e39f4bbb9647c85ab6863700d28cf098764ed884fa11382786c2c3d58157"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.841322 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" event={"ID":"4d1c9711-c6ae-4b4a-bafb-07891dd80514","Type":"ContainerStarted","Data":"3ff9e2f0463880f694b80018bf30ace6667c7fa1fc36b14bcf6053fd8d35a063"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.841359 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" event={"ID":"4d1c9711-c6ae-4b4a-bafb-07891dd80514","Type":"ContainerStarted","Data":"0e70913996d9ee9a22f7f138c759334382546e7610ff7f72a2c3c633f783a50a"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.842999 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q8nqs" 
event={"ID":"f2cfce66-2c6c-41ad-84f6-6726c3ea088d","Type":"ContainerStarted","Data":"7201fff4a36913db7ac99c9df84fd6af2d797cd2d74ed7b1cf4a6d4e4ee75e0e"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.844460 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" event={"ID":"c809652e-9ba8-4499-b2c5-4a0015048ea0","Type":"ContainerStarted","Data":"14ca12b26c73366bf03480f0ae04ed7865d6279a5b7400a69bc951dddf670c50"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.845774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x6b6q" event={"ID":"7a82fd60-88d6-4e1a-b140-bf2f92ea0a87","Type":"ContainerStarted","Data":"b55d1c571c43900a6d6c3b375bec89769e317605b411fb9837dda873034694a8"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.848849 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x6b6q" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.851570 4728 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6b6q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.851601 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6b6q" podUID="7a82fd60-88d6-4e1a-b140-bf2f92ea0a87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.853856 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.855737 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.355718521 +0000 UTC m=+148.195897505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.858957 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" event={"ID":"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf","Type":"ContainerStarted","Data":"c5e41b060f63c748f410fa7ca33877b2198592217a99c826aa275dcba0f82979"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.858994 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" event={"ID":"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf","Type":"ContainerStarted","Data":"52040c599a95bb3a936989fab87ff71c5419f13ddb3d02f5806cc8fde247aba2"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.859787 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.861176 4728 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hnrqv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.861208 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.877453 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" event={"ID":"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c","Type":"ContainerStarted","Data":"2c6e6ce4ba4bc7edf29b0cb1f2a1c26cff77697a923c0d66412d8fe75d7bd5ab"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.877504 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" event={"ID":"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c","Type":"ContainerStarted","Data":"176caf5ce4ffb3c270a4f04b619884d33067370f329626e33fce04a4d275f005"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.884126 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jwssj" podStartSLOduration=123.884105905 podStartE2EDuration="2m3.884105905s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:26.877784826 +0000 UTC m=+147.717963810" watchObservedRunningTime="2025-12-16 14:59:26.884105905 +0000 UTC m=+147.724284889" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.889933 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" event={"ID":"60329838-5948-4dc3-8ff1-cf2951d99197","Type":"ContainerStarted","Data":"bc7973acc330b11f3aebfd1a052f1813368d611b6b049c39ac0b6accb8494547"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.891733 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" event={"ID":"971f1003-73c5-40ca-8c3a-5479215b4e72","Type":"ContainerStarted","Data":"4609ac654f4f62ee1b9ec09b57bbb4ffbad5888033585c924cccfdfdef4f36ed"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.891757 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" event={"ID":"971f1003-73c5-40ca-8c3a-5479215b4e72","Type":"ContainerStarted","Data":"5fe814f259d99e08e3f09a4584bafd89e9607ff6cd831761042b3a2fe831caf1"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.894932 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" event={"ID":"44dea519-de7c-4bc5-934e-5c251177b6fd","Type":"ContainerStarted","Data":"5bed0819f48a9c3b3d6e1d627069e110e6392ce56f690368a4a3efbc44386460"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.897495 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pfz7w"] Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.898121 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" event={"ID":"b7a24aa5-4c07-4f1f-984a-c3e4a73231e4","Type":"ContainerStarted","Data":"f56b16a721976b66c471059c3b787712975c1ccad7c0c60ec82be458dfd43bbb"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.898157 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.905580 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" event={"ID":"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d","Type":"ContainerStarted","Data":"b2c0f614c2dbcb5a5db082689540734fe0a9834c6a5c241a4648c3a9d439210f"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.905654 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" event={"ID":"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d","Type":"ContainerStarted","Data":"24c413d23c5fc3afe4bee5462992ddd78f6f43bd723bbdd3f2c5a2bb9e7f3251"} Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.934075 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-phlkm"] Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.937938 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m9d45"] Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.957363 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:26 crc kubenswrapper[4728]: E1216 14:59:26.967605 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.467586494 +0000 UTC m=+148.307765478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:26 crc kubenswrapper[4728]: I1216 14:59:26.995158 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ncw5z" podStartSLOduration=123.995133487 podStartE2EDuration="2m3.995133487s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:26.988870469 +0000 UTC m=+147.829049453" watchObservedRunningTime="2025-12-16 14:59:26.995133487 +0000 UTC m=+147.835312471" Dec 16 14:59:27 crc kubenswrapper[4728]: W1216 14:59:27.038986 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod693eed37_ae85_4a0b_a3e3_4e908245ac13.slice/crio-359e44af76de4798a19f22df1a684ad2ab060e060903829daaa70081cb6eaca1 WatchSource:0}: Error finding container 359e44af76de4798a19f22df1a684ad2ab060e060903829daaa70081cb6eaca1: Status 404 returned error can't find the container with id 359e44af76de4798a19f22df1a684ad2ab060e060903829daaa70081cb6eaca1 Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.064132 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.064476 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.56445448 +0000 UTC m=+148.404633464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.064945 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.065362 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.565295551 +0000 UTC m=+148.405474535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.077760 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" podStartSLOduration=124.077735614 podStartE2EDuration="2m4.077735614s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.03105386 +0000 UTC m=+147.871232834" watchObservedRunningTime="2025-12-16 14:59:27.077735614 +0000 UTC m=+147.917914598" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.090932 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.107977 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kqbmn"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.116695 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hw5mk"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.177872 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.178159 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-16 14:59:27.678140179 +0000 UTC m=+148.518319163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.181483 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.181813 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.681802391 +0000 UTC m=+148.521981375 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.226751 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tznx9" podStartSLOduration=124.226731111 podStartE2EDuration="2m4.226731111s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.221889889 +0000 UTC m=+148.062068873" watchObservedRunningTime="2025-12-16 14:59:27.226731111 +0000 UTC m=+148.066910095" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.288232 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" podStartSLOduration=124.288209817 podStartE2EDuration="2m4.288209817s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.265689781 +0000 UTC m=+148.105868765" watchObservedRunningTime="2025-12-16 14:59:27.288209817 +0000 UTC m=+148.128388801" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.296017 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.296281 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.79626707 +0000 UTC m=+148.636446054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.300351 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k6z5"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.336071 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4tpkr"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.364912 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-55pkk" podStartSLOduration=124.364896015 podStartE2EDuration="2m4.364896015s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.363620493 +0000 UTC m=+148.203799487" watchObservedRunningTime="2025-12-16 14:59:27.364896015 +0000 UTC m=+148.205074999" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.372195 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.392521 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p2hzh" podStartSLOduration=124.392499649 podStartE2EDuration="2m4.392499649s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.39214552 +0000 UTC m=+148.232324504" watchObservedRunningTime="2025-12-16 14:59:27.392499649 +0000 UTC m=+148.232678623" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.397285 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.397659 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:27.897648119 +0000 UTC m=+148.737827103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.424057 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.432979 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:27 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:27 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:27 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.433026 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.486258 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.502011 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.502314 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.00229597 +0000 UTC m=+148.842474954 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.509336 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.533015 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" podStartSLOduration=124.532997262 podStartE2EDuration="2m4.532997262s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.486115573 +0000 UTC m=+148.326294557" watchObservedRunningTime="2025-12-16 14:59:27.532997262 +0000 UTC m=+148.373176246" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.567454 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" podStartSLOduration=124.567433518 podStartE2EDuration="2m4.567433518s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.51500575 +0000 UTC m=+148.355184734" watchObservedRunningTime="2025-12-16 14:59:27.567433518 +0000 UTC m=+148.407612502" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.585175 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" podStartSLOduration=124.585156254 podStartE2EDuration="2m4.585156254s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.544132492 +0000 UTC m=+148.384311476" watchObservedRunningTime="2025-12-16 14:59:27.585156254 +0000 UTC m=+148.425335238" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.590064 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" podStartSLOduration=124.590042777 podStartE2EDuration="2m4.590042777s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.573521722 +0000 UTC m=+148.413700706" watchObservedRunningTime="2025-12-16 14:59:27.590042777 +0000 UTC m=+148.430221761" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.602740 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-49d5p" podStartSLOduration=124.602720905 podStartE2EDuration="2m4.602720905s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 
14:59:27.59816792 +0000 UTC m=+148.438346904" watchObservedRunningTime="2025-12-16 14:59:27.602720905 +0000 UTC m=+148.442899889" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.603351 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.608746 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.108732046 +0000 UTC m=+148.948911030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.639547 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x6b6q" podStartSLOduration=124.639529921 podStartE2EDuration="2m4.639529921s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:27.638009363 +0000 UTC m=+148.478188337" watchObservedRunningTime="2025-12-16 14:59:27.639529921 +0000 UTC m=+148.479708905" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.654118 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.654258 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.654368 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.654455 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pr5wl"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.654547 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-87x9r"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.654628 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5t99c"] Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.695658 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8cvcc" podStartSLOduration=124.695638812 podStartE2EDuration="2m4.695638812s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-16 14:59:27.678829999 +0000 UTC m=+148.519008983" watchObservedRunningTime="2025-12-16 14:59:27.695638812 +0000 UTC m=+148.535817806" Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.708892 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.709330 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.209308606 +0000 UTC m=+149.049487590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.710740 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.711137 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.211124152 +0000 UTC m=+149.051303136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.811644 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.811907 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.311893126 +0000 UTC m=+149.152072110 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:27 crc kubenswrapper[4728]: I1216 14:59:27.913501 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:27 crc kubenswrapper[4728]: E1216 14:59:27.914015 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.414003083 +0000 UTC m=+149.254182067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.015263 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.015347 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.515328631 +0000 UTC m=+149.355507615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.015731 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.016018 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.516010618 +0000 UTC m=+149.356189602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.027152 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" event={"ID":"b3bb3df3-c0ee-450a-827f-ff91fe54d0db","Type":"ContainerStarted","Data":"c988638d00001b80b8229cc4592300d4239d3fdff30855476c3ad16126e2cce2"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.039138 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pr5wl" event={"ID":"db457bae-59bc-4ec6-b5dd-8699c5794f76","Type":"ContainerStarted","Data":"cfc60d52c1500061a07d9c1f39939fb4a4fe915d8c2ee698ee4cfab05509d566"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.064840 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" event={"ID":"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7","Type":"ContainerStarted","Data":"a2df86c678b1eb31e835d3d42de0fccfbdb3b2dd317023d8900ed42c2ebc7046"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.072904 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" event={"ID":"6f4e29bc-31e9-421e-abdf-c692d1dfb0c7","Type":"ContainerStarted","Data":"32b3e009ced44109099d171f1a7e5f85d20a3e9545973afb2d85709403b8aabe"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.082275 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" event={"ID":"ca945ba8-363c-4e60-b11a-6938e4cb9354","Type":"ContainerStarted","Data":"369ffc81a7c153229455658439ebc6525c9ad8ed9561d510db94d7020fd86991"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.090791 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mf9mx" 
event={"ID":"4723c39e-a85e-4c31-b938-6ace3fb5f700","Type":"ContainerStarted","Data":"909b2f884e0b8c7fe5a0d85714e4c8ee0872def7b32ad210de0c77ed1f717b7d"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.113312 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" event={"ID":"14a40ff4-9558-428f-a784-c18c5d62d60a","Type":"ContainerStarted","Data":"543018e68eee99f0afb160e0b4252268969be966c8efe7c3dfd61fd3bb560629"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.125132 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.125345 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.625323257 +0000 UTC m=+149.465502241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.125616 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9d45" event={"ID":"95388006-228d-47cb-ab64-42cea04840bc","Type":"ContainerStarted","Data":"15895d32e2b0057f4b81aab9613cfd12d55e35a7bdae000423fb32773902fb06"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.125722 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9d45" event={"ID":"95388006-228d-47cb-ab64-42cea04840bc","Type":"ContainerStarted","Data":"bf07933257130b1c8c9a691c9bd241bc052e1c62a3e48c73ada5e2cb1b3f80b0"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.126053 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.127203 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.627191924 +0000 UTC m=+149.467370908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.127972 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" podStartSLOduration=125.127952183 podStartE2EDuration="2m5.127952183s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.12501937 +0000 UTC m=+148.965198354" watchObservedRunningTime="2025-12-16 14:59:28.127952183 +0000 UTC m=+148.968131167" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.128167 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-87x9r" event={"ID":"31f40573-11e0-49ad-adeb-d0f013b07696","Type":"ContainerStarted","Data":"00210cf48856a61dd084733cb7f0b5483a15b22f6a41b314aa38da69559bba07"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.132544 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" event={"ID":"a58bcc67-9370-45bf-a2b4-96dec3c76b3f","Type":"ContainerStarted","Data":"b171169d64d7749297c53c68f5528cf36f487c0217f7c193a34ea448c5c03837"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.133902 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" event={"ID":"7d652ee4-8157-4667-a521-c75855e26796","Type":"ContainerStarted","Data":"3e3472c8b8aa5c96ae63d7d287adab3bfe9456b8789c6cfb8314c3f5bec00f84"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.133927 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" event={"ID":"7d652ee4-8157-4667-a521-c75855e26796","Type":"ContainerStarted","Data":"21c51d429576bd217d8aa72a1f44d7d0b751a3f0af9923e0b723a8d982846fac"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.151963 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3a317a2b903d3caad9fe80dd57596d84371bccc352dfe89a27b3f29f29cb7f78"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.162188 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mf9mx" podStartSLOduration=6.162171584 podStartE2EDuration="6.162171584s" podCreationTimestamp="2025-12-16 14:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.161191348 +0000 UTC m=+149.001370332" watchObservedRunningTime="2025-12-16 14:59:28.162171584 +0000 UTC m=+149.002350568" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.179833 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" 
event={"ID":"5ee42482-d554-4944-8fce-e503c10c0ec9","Type":"ContainerStarted","Data":"5c08b26bfa771fbba1bb147740bee123388efb90b05448b5b7ba03645eb0a7a0"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.182273 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" podStartSLOduration=125.182257179 podStartE2EDuration="2m5.182257179s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.18148878 +0000 UTC m=+149.021667764" watchObservedRunningTime="2025-12-16 14:59:28.182257179 +0000 UTC m=+149.022436163" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.235208 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.236566 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.736537903 +0000 UTC m=+149.576716887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.243602 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ggsk6" podStartSLOduration=125.243581331 podStartE2EDuration="2m5.243581331s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.242104783 +0000 UTC m=+149.082283767" watchObservedRunningTime="2025-12-16 14:59:28.243581331 +0000 UTC m=+149.083760315" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.251756 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" event={"ID":"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe","Type":"ContainerStarted","Data":"b3f819db80c19824e96b942a5a824bd5adb1330bc960ab3c5f16317cb971b8e6"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.251815 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" event={"ID":"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe","Type":"ContainerStarted","Data":"a63b2156abc11537e8e10c176da372f8bf15c9ce920575e6e93d38e48ad30732"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.305141 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" 
event={"ID":"3cea0eff-ca09-4d09-9f20-485c9ef6003b","Type":"ContainerStarted","Data":"b5874025dd262e310cab686850c2ac1c93c5b03b6f2f937588cc8da6aea363ca"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.309223 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" event={"ID":"4bebf51e-0577-4143-b337-f63a04d6a73d","Type":"ContainerStarted","Data":"27d41b994aba3ee07df45b5b014f721f97f7b45951130f8c95b9bc01b67e043e"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.318895 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" event={"ID":"e75ced0f-eb7d-4781-a34a-358c9e5db98a","Type":"ContainerStarted","Data":"315d3ba8b8a2fd8aa6f32891d6d05c514307dfa8c583eb802394f99f19cb425c"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.321005 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" event={"ID":"60329838-5948-4dc3-8ff1-cf2951d99197","Type":"ContainerStarted","Data":"7f994b00e533e19979a946d17e2a7305708190d19bc2c92fef905605c9191aa2"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.321027 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" event={"ID":"60329838-5948-4dc3-8ff1-cf2951d99197","Type":"ContainerStarted","Data":"24492f692a5792cbfc2ea9bb1b797b408a41d7cf46c9f166f982c82d6254a90a"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.339552 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.339894 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.839882922 +0000 UTC m=+149.680061906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.346834 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rkm4q" podStartSLOduration=125.346820297 podStartE2EDuration="2m5.346820297s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.341474352 +0000 UTC m=+149.181653336" watchObservedRunningTime="2025-12-16 14:59:28.346820297 +0000 UTC m=+149.186999271" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.396591 4728 generic.go:334] "Generic (PLEG): container finished" podID="179db76a-61fb-4ae3-a494-be67c44c7d65" containerID="79d5726c63c721a903859b9e50402a2d92399a3bc2e5ac2740711a1465944199" exitCode=0 Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.396703 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" event={"ID":"179db76a-61fb-4ae3-a494-be67c44c7d65","Type":"ContainerDied","Data":"79d5726c63c721a903859b9e50402a2d92399a3bc2e5ac2740711a1465944199"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.437206 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:28 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:28 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:28 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.437662 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.441008 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.441709 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:28.941675642 +0000 UTC m=+149.781854626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.447520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-55pkk" event={"ID":"05c53e6d-0610-4944-a6af-1fdb84368f05","Type":"ContainerStarted","Data":"d538ccc6f35051b00a3038f578ad124ec9eca56bb5e36a5acefb22474a3c6872"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.492520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" event={"ID":"4dc26d87-3ce0-4071-bf63-07cbe9e19d6c","Type":"ContainerStarted","Data":"62b02b9e2e14b00ae0fea043ee63ee828b4f9a3677a3afd26c00cb54ee3ef674"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.510606 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" event={"ID":"a5f01333-4573-4a05-b1cf-5dfdc95a33cd","Type":"ContainerStarted","Data":"60ebffa2f39bb13b43358af263f1f3f5c55c680dddfcd471ddda9665229c801d"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.510650 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" event={"ID":"a5f01333-4573-4a05-b1cf-5dfdc95a33cd","Type":"ContainerStarted","Data":"22e15c444c9f10909adfe88368223eacab723e8f98c03b2f7b64a22c3b7a5bff"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.516480 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" event={"ID":"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700","Type":"ContainerStarted","Data":"6b93a8d2013e96a634486cbc5dc0fb0e9b6d610f350c03a6b13174ab66835a5d"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.524817 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9mv8j" podStartSLOduration=125.524799012 podStartE2EDuration="2m5.524799012s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.524741971 +0000 UTC m=+149.364920955" watchObservedRunningTime="2025-12-16 14:59:28.524799012 +0000 UTC m=+149.364977996" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.527836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l9s9x" event={"ID":"44dea519-de7c-4bc5-934e-5c251177b6fd","Type":"ContainerStarted","Data":"9cc0d55a0e8fda1fb017bd82c4b01d1d2bb71ab0c7ed062f8e8c0d75f8e1c0e2"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.538720 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" event={"ID":"36b07558-0136-4600-86f4-d20044b9910d","Type":"ContainerStarted","Data":"9afca44f83a628477306341d23f8277c6da832da21a6ec6b82dbf13d06ade212"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.539787 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"99b941f814ddba6c152f1395563a2d93b6f179a4ae286ddbf452a988a60c5a3c"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.546880 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" event={"ID":"f8b4426a-81fb-42cc-92ff-9f488d660820","Type":"ContainerStarted","Data":"a3d1d0a80d8971210d61367d3e11e5914b66b28ebed9ffe155aee6cde6a24ee8"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.547917 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.548157 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.04814673 +0000 UTC m=+149.888325714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.558108 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c074197f3cb8e9ac6c53be8309d07cb0a5cbf228f39c65a31bfff72658f459d3"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.583240 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hw5mk" podStartSLOduration=125.583223431 podStartE2EDuration="2m5.583223431s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.581383195 +0000 UTC m=+149.421562179" watchObservedRunningTime="2025-12-16 14:59:28.583223431 +0000 UTC m=+149.423402415" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.587949 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g66rv" event={"ID":"510146d5-b80f-404e-9aac-8e06a20a0c44","Type":"ContainerStarted","Data":"152a59ea87b4b70075e5553f4ae7d595dd0b415e7225cff298b57bb467c92f32"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.608252 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-phlkm" event={"ID":"693eed37-ae85-4a0b-a3e3-4e908245ac13","Type":"ContainerStarted","Data":"e98a17baeb7ee265ec6c909d8c470647a7be9b665c76068838bd7a54d2f36815"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.608294 4728 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-phlkm" event={"ID":"693eed37-ae85-4a0b-a3e3-4e908245ac13","Type":"ContainerStarted","Data":"359e44af76de4798a19f22df1a684ad2ab060e060903829daaa70081cb6eaca1"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.608811 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.612836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" event={"ID":"f9b2dba4-0e5d-45df-9d96-08f13c21c06a","Type":"ContainerStarted","Data":"6e1e0dfcb6d00027601ccbc0a161dece97944befbf6ed9340b54be56acf8435b"} Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.612860 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.616643 4728 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6b6q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.616894 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6b6q" podUID="7a82fd60-88d6-4e1a-b140-bf2f92ea0a87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.629603 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.648910 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.649636 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.149621611 +0000 UTC m=+149.989800595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.649785 4728 patch_prober.go:28] interesting pod/console-operator-58897d9998-phlkm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.649830 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-phlkm" podUID="693eed37-ae85-4a0b-a3e3-4e908245ac13" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.716717 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-phlkm" podStartSLOduration=125.716693398 podStartE2EDuration="2m5.716693398s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.709125197 +0000 UTC m=+149.549304181" watchObservedRunningTime="2025-12-16 14:59:28.716693398 +0000 UTC m=+149.556872442" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.751483 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.752300 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" podStartSLOduration=125.752278462 podStartE2EDuration="2m5.752278462s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:28.735661945 +0000 UTC m=+149.575840929" watchObservedRunningTime="2025-12-16 14:59:28.752278462 +0000 UTC m=+149.592457446" Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.756823 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.256810196 +0000 UTC m=+150.096989250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.855782 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.856084 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.356069372 +0000 UTC m=+150.196248356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:28 crc kubenswrapper[4728]: I1216 14:59:28.958427 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:28 crc kubenswrapper[4728]: E1216 14:59:28.958792 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.458773995 +0000 UTC m=+150.298952979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.060048 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.060308 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.560289667 +0000 UTC m=+150.400468651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.165627 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.165911 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.665900673 +0000 UTC m=+150.506079647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.267430 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.267591 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.76756774 +0000 UTC m=+150.607746724 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.267686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.267964 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.76795344 +0000 UTC m=+150.608132424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.368979 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.369627 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.869606546 +0000 UTC m=+150.709785530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.431544 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:29 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:29 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:29 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.431592 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.432545 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.432589 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.471704 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.471983 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:29.97197227 +0000 UTC m=+150.812151254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.572373 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.572590 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.07257403 +0000 UTC m=+150.912753014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.572687 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.573055 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.073044181 +0000 UTC m=+150.913223165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.617739 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e1d75e76b92f9da0a079d39ab363c341e0826c999d894f78660fa50a4346b564"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.621489 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" event={"ID":"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700","Type":"ContainerStarted","Data":"28e4d171f516f83b4f32929686c33b9a7c2abe263b2d9746488fa80df6babdba"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.621529 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" event={"ID":"7e3a9eb1-9f6d-41f6-8bb9-86b8d8f42700","Type":"ContainerStarted","Data":"61c8dae5142c8c734f683aeb81fa28df279c93bf2ec044216bad3ea7b433868b"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.623625 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" event={"ID":"a58bcc67-9370-45bf-a2b4-96dec3c76b3f","Type":"ContainerStarted","Data":"eba23acb7560165ea7dbf3b75a6b1ab432dd57f6ed580c70e171d9323377ea06"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.623676 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" event={"ID":"a58bcc67-9370-45bf-a2b4-96dec3c76b3f","Type":"ContainerStarted","Data":"1e8ce7209c116ae21a3a50b7a58db1ba218bbaaa0b01909b6313490f8a3a94dd"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.624882 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" event={"ID":"b3bb3df3-c0ee-450a-827f-ff91fe54d0db","Type":"ContainerStarted","Data":"9ac0ca59d65db4ce7355599adef0be190ba61bfd79d39e19b977be9897fd98ad"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.625099 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.626287 4728 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-n7d4h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.626325 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" podUID="b3bb3df3-c0ee-450a-827f-ff91fe54d0db" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.626472 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-pr5wl" event={"ID":"db457bae-59bc-4ec6-b5dd-8699c5794f76","Type":"ContainerStarted","Data":"dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.633931 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m9d45" event={"ID":"95388006-228d-47cb-ab64-42cea04840bc","Type":"ContainerStarted","Data":"0b5f2fc46f92d12b2892ed9bfdd10ea56e28bf212cfe2c156b4901b1135412bd"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.634491 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.637087 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" event={"ID":"e75ced0f-eb7d-4781-a34a-358c9e5db98a","Type":"ContainerStarted","Data":"0e7bdbd53c97469b61fe33653ffb1b5757bc292eea24cc9e79fd1c7d756292b2"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.639912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" event={"ID":"f8b4426a-81fb-42cc-92ff-9f488d660820","Type":"ContainerStarted","Data":"95d354a7f153eca71b5135156cd90dc00c073e837e9134a8c4b85f71b383f092"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.642440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ea7cd57cc31d79c75ce71f937e850c2eda382f322a0c8e5e8d738f283249a0d1"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.645357 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" event={"ID":"179db76a-61fb-4ae3-a494-be67c44c7d65","Type":"ContainerStarted","Data":"bad29c2e7cb7a77ccad52b995c2ed6ddb780d57727a122f660694d6c59b14fe9"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.653462 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" event={"ID":"6ef09dcb-9a41-4fb0-8492-cdd81b0222fe","Type":"ContainerStarted","Data":"afaf71db00a4f76ea7a5508978deb7aa989fd5394961b98cca1c0ce29ed9adbf"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.659033 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-87x9r" event={"ID":"31f40573-11e0-49ad-adeb-d0f013b07696","Type":"ContainerStarted","Data":"34477407916e00946d73804d45808311e082d9941b2d48b1c70c7fc052baf4eb"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.667843 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" event={"ID":"36b07558-0136-4600-86f4-d20044b9910d","Type":"ContainerStarted","Data":"468b5accc1a367df17cffb6546a870916cf7265545c2bd6d3a2b01ebee44992d"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.673135 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.673282 4728 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.173259202 +0000 UTC m=+151.013438186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.673399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.673730 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.173712743 +0000 UTC m=+151.013891727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.674973 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" event={"ID":"7d652ee4-8157-4667-a521-c75855e26796","Type":"ContainerStarted","Data":"8a7e926cfb02ad1b19e013ef9fe5723c8600a9fc2f5cb679b600ae6d05d219f2"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.677374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" event={"ID":"3cea0eff-ca09-4d09-9f20-485c9ef6003b","Type":"ContainerStarted","Data":"2162a974c5ae8c4fc6bebda544e9885dfbcebcf83090f5c9b5d3a66b63a3db76"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.679296 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.680455 4728 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z6qqt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.680506 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" podUID="3cea0eff-ca09-4d09-9f20-485c9ef6003b" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.687766 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" event={"ID":"4bebf51e-0577-4143-b337-f63a04d6a73d","Type":"ContainerStarted","Data":"190d05d6623e297c4e618507ced9c080f0e9a3bc9c016ae83b7aa87ae149c575"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.688010 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.719763 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" event={"ID":"ca945ba8-363c-4e60-b11a-6938e4cb9354","Type":"ContainerStarted","Data":"3f7db6888f46974d186c856ac3e6417cc401141d6d4952e0fc6f96412525d753"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.721622 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.749709 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"623c603dd069a2eb44345e0b5a031bc6227b3cc334eba8bec6e0fddfc7215af3"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.749782 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.769602 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zf5xv" event={"ID":"14a40ff4-9558-428f-a784-c18c5d62d60a","Type":"ContainerStarted","Data":"61655b5e6ca109267133647152d7372e81608a889e8f59e8157a99bf7961c240"} Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.774253 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.775651 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.275634176 +0000 UTC m=+151.115813170 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.825848 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-phlkm" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.882364 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.885238 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.385223442 +0000 UTC m=+151.225402426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.928160 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.929348 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.931384 4728 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-bh4dl container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.22:8443/livez\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.931459 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" podUID="179db76a-61fb-4ae3-a494-be67c44c7d65" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.22:8443/livez\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 16 14:59:29 crc kubenswrapper[4728]: I1216 14:59:29.984284 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:29 crc kubenswrapper[4728]: E1216 14:59:29.984635 4728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.484620611 +0000 UTC m=+151.324799595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.057077 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kqbmn" podStartSLOduration=127.057064613 podStartE2EDuration="2m7.057064613s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.055097684 +0000 UTC m=+150.895276668" watchObservedRunningTime="2025-12-16 14:59:30.057064613 +0000 UTC m=+150.897243587" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.085992 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.086426 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.586388121 +0000 UTC m=+151.426567105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.138178 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pr5wl" podStartSLOduration=127.138162162 podStartE2EDuration="2m7.138162162s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.136305865 +0000 UTC m=+150.976484859" watchObservedRunningTime="2025-12-16 14:59:30.138162162 +0000 UTC m=+150.978341146" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.187520 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.187722 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.687694148 +0000 UTC m=+151.527873132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.187807 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.188095 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.688083978 +0000 UTC m=+151.528262962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.288482 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.288712 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.788679848 +0000 UTC m=+151.628858852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.288772 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.289061 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.789038666 +0000 UTC m=+151.629217650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.390163 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.390362 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.890329473 +0000 UTC m=+151.730508457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.390439 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.390716 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.890704003 +0000 UTC m=+151.730882987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.408718 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt" podStartSLOduration=127.408698166 podStartE2EDuration="2m7.408698166s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.40768158 +0000 UTC m=+151.247860564" watchObservedRunningTime="2025-12-16 14:59:30.408698166 +0000 UTC m=+151.248877150" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.410270 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qnflb" podStartSLOduration=127.410265184 podStartE2EDuration="2m7.410265184s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.297234543 +0000 UTC m=+151.137413527" watchObservedRunningTime="2025-12-16 14:59:30.410265184 +0000 UTC m=+151.250444168" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.424316 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:30 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:30 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:30 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.424377 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.491759 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.492314 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:30.992298478 +0000 UTC m=+151.832477462 (durationBeforeRetry 500ms). 
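The cycle above repeats every 500ms: UnmountVolume.TearDown for the terminated pod 8f668bae-612b-4b75-9490-919e737c6a3b and MountVolume.MountDevice for image-registry-697d97f7c8-m9gpr both need a CSI client for kubevirt.io.hostpath-provisioner, the kubelet cannot find that driver name among its registered CSI drivers, and nestedpendingoperations schedules the next attempt (durationBeforeRetry 500ms). One way to watch for the driver to appear is the node's CSINode object, which gains one entry per driver that completes node registration. The following is a minimal client-go sketch, not part of this log; the kubeconfig path is an assumption, and the node name crc is taken from these log lines.

// csidrivers.go - minimal client-go sketch (assumptions: kubeconfig at
// ~/.kube/config; node name "crc" as in this log). Prints the CSI drivers
// that have registered on the node; until kubevirt.io.hostpath-provisioner
// appears here, the MountDevice/TearDown retries above keep failing.
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	home, _ := os.UserHomeDir()
	cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(home, ".kube", "config"))
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// The CSINode object holds one entry per driver registered on this node.
	cn, err := cs.StorageV1().CSINodes().Get(context.Background(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range cn.Spec.Drivers {
		fmt.Println("registered:", d.Name)
	}
}

Until kubevirt.io.hostpath-provisioner shows up in that list, each retry below fails the same way.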
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.492736 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-25mqz" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.601156 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.602730 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.102716834 +0000 UTC m=+151.942895818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.678172 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" podStartSLOduration=127.678149131 podStartE2EDuration="2m7.678149131s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.628466222 +0000 UTC m=+151.468645196" watchObservedRunningTime="2025-12-16 14:59:30.678149131 +0000 UTC m=+151.518328115" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.702919 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.703269 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.203252563 +0000 UTC m=+152.043431547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.712450 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7r8ts" podStartSLOduration=127.712431553 podStartE2EDuration="2m7.712431553s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.681207678 +0000 UTC m=+151.521386662" watchObservedRunningTime="2025-12-16 14:59:30.712431553 +0000 UTC m=+151.552610537"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.713043 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h" podStartSLOduration=127.713035438 podStartE2EDuration="2m7.713035438s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.707812778 +0000 UTC m=+151.547991782" watchObservedRunningTime="2025-12-16 14:59:30.713035438 +0000 UTC m=+151.553214422"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.771495 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-87x9r" podStartSLOduration=8.771479238 podStartE2EDuration="8.771479238s" podCreationTimestamp="2025-12-16 14:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.768859402 +0000 UTC m=+151.609038386" watchObservedRunningTime="2025-12-16 14:59:30.771479238 +0000 UTC m=+151.611658222"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.795492 4728 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7rjsg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]log ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]etcd ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/max-in-flight-filter ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Dec 16 14:59:30 crc kubenswrapper[4728]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/project.openshift.io-projectcache ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-restmapperupdater ok
Dec 16 14:59:30 crc kubenswrapper[4728]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 16 14:59:30 crc kubenswrapper[4728]: livez check failed
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.795553 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" podUID="6f4e29bc-31e9-421e-abdf-c692d1dfb0c7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.808393 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.808729 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.308717225 +0000 UTC m=+152.148896209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.819443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" event={"ID":"36b07558-0136-4600-86f4-d20044b9910d","Type":"ContainerStarted","Data":"c4bf78fc7e43e31610d43aeac72c7830bebec4f72847b29e044f9ff60c79cccd"}
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.822498 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.831694 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-drpzn" podStartSLOduration=127.831673782 podStartE2EDuration="2m7.831673782s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.829932818 +0000 UTC m=+151.670111802" watchObservedRunningTime="2025-12-16 14:59:30.831673782 +0000 UTC m=+151.671852766"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.843592 4728 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8k6z5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.843637 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" podUID="ca945ba8-363c-4e60-b11a-6938e4cb9354" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.854893 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n7d4h"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.900933 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4tpkr" podStartSLOduration=127.900916103 podStartE2EDuration="2m7.900916103s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.870906909 +0000 UTC m=+151.711085893" watchObservedRunningTime="2025-12-16 14:59:30.900916103 +0000 UTC m=+151.741095087"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.901619 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pfz7w" podStartSLOduration=127.901612091 podStartE2EDuration="2m7.901612091s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.898878072 +0000 UTC m=+151.739057056" watchObservedRunningTime="2025-12-16 14:59:30.901612091 +0000 UTC m=+151.741791075"
Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.909724 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 14:59:30 crc kubenswrapper[4728]: E1216 14:59:30.914146 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.414122715 +0000 UTC m=+152.254301699 (durationBeforeRetry 500ms).
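Two probe failures are interleaved with the volume retries, both ordinary start-up noise: the router-default startup probe returns HTTP 500 while its backend-http and has-synced checks are still failing, and the marketplace-operator readiness probe gets connection refused because nothing is listening on 10.217.0.32:8080 yet. The snippet below reproduces that readiness check by hand; it is only a rough stand-in for the kubelet prober, and the URL is copied from the log (it is reachable only from the node's network).

// probecheck.go - rough equivalent of the failing readiness probe above
// (illustrative only; the endpoint 10.217.0.32:8080/healthz comes from the
// log and resolves only inside the cluster/node network).
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	c := &http.Client{Timeout: 1 * time.Second}
	resp, err := c.Get("http://10.217.0.32:8080/healthz")
	if err != nil {
		// Matches the log's failure mode: dial tcp ... connect: connection refused.
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	// The kubelet counts 2xx/3xx as success; a 500 body like the router's
	// "healthz check failed" keeps the probe failing.
	fmt.Println("probe status:", resp.StatusCode)
}

A connection refused and an HTTP 500 are both just failures to the prober; the pods stay unready while their containers finish initializing.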
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.948266 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" podStartSLOduration=127.948230332 podStartE2EDuration="2m7.948230332s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.939729239 +0000 UTC m=+151.779908233" watchObservedRunningTime="2025-12-16 14:59:30.948230332 +0000 UTC m=+151.788409316" Dec 16 14:59:30 crc kubenswrapper[4728]: I1216 14:59:30.998703 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m9d45" podStartSLOduration=8.998681251 podStartE2EDuration="8.998681251s" podCreationTimestamp="2025-12-16 14:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:30.994914147 +0000 UTC m=+151.835093131" watchObservedRunningTime="2025-12-16 14:59:30.998681251 +0000 UTC m=+151.838860235" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.012152 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.012478 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.512464108 +0000 UTC m=+152.352643092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.028219 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nt6nd" podStartSLOduration=128.028199884 podStartE2EDuration="2m8.028199884s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:31.027481666 +0000 UTC m=+151.867660670" watchObservedRunningTime="2025-12-16 14:59:31.028199884 +0000 UTC m=+151.868378868" Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.115289 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.615270443 +0000 UTC m=+152.455449427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.115314 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.115730 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.116051 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.616031252 +0000 UTC m=+152.456210236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.216306 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.216540 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.716505309 +0000 UTC m=+152.556684293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.216786 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.217147 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.717139804 +0000 UTC m=+152.557318788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.318535 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.318723 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.818696719 +0000 UTC m=+152.658875703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.318855 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.319139 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.81912739 +0000 UTC m=+152.659306374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.386920 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rgppn"]
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.387873 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgppn"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.390116 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.400702 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgppn"]
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.426072 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.426162 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.92614004 +0000 UTC m=+152.766319024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.426227 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.426570 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:31.926562981 +0000 UTC m=+152.766741965 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.430874 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:31 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:31 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:31 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.430967 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.527588 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.527777 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-catalog-content\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.527858 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvx2\" (UniqueName: \"kubernetes.io/projected/c5d6795c-254f-428c-9fc2-c37b2e224b54-kube-api-access-qtvx2\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.527911 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-utilities\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn" Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.528013 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.027997521 +0000 UTC m=+152.868176505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.586346 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-78qzz"] Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.587372 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.589212 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.599027 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-78qzz"] Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.628964 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-utilities\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.629001 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-catalog-content\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.629060 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvx2\" (UniqueName: \"kubernetes.io/projected/c5d6795c-254f-428c-9fc2-c37b2e224b54-kube-api-access-qtvx2\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.629090 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.629376 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.12936465 +0000 UTC m=+152.969543634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.629842 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-utilities\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.630218 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-catalog-content\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.632520 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6qqt"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.660385 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvx2\" (UniqueName: \"kubernetes.io/projected/c5d6795c-254f-428c-9fc2-c37b2e224b54-kube-api-access-qtvx2\") pod \"community-operators-rgppn\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " pod="openshift-marketplace/community-operators-rgppn"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.705455 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgppn"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.729758 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.729978 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-utilities\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.730005 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-catalog-content\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.730055 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffqc\" (UniqueName: \"kubernetes.io/projected/d5541d34-e213-4545-af81-6410a52db88d-kube-api-access-fffqc\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.730212 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.230182975 +0000 UTC m=+153.070361949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.814079 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlwvp"]
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.814906 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.838072 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffqc\" (UniqueName: \"kubernetes.io/projected/d5541d34-e213-4545-af81-6410a52db88d-kube-api-access-fffqc\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.838133 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.838180 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-utilities\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.838205 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-catalog-content\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.838643 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-catalog-content\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz"
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.839099 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.339088044 +0000 UTC m=+153.179267028 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.839449 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-utilities\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.844180 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlwvp"] Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.851570 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" event={"ID":"36b07558-0136-4600-86f4-d20044b9910d","Type":"ContainerStarted","Data":"500bd3c390671f93ed0dff9c107e8d5168d07aa86092c51cb50aa9230f6ed447"} Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.867720 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.873339 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffqc\" (UniqueName: \"kubernetes.io/projected/d5541d34-e213-4545-af81-6410a52db88d-kube-api-access-fffqc\") pod \"certified-operators-78qzz\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " pod="openshift-marketplace/certified-operators-78qzz" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.909517 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.938895 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.939031 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.439003657 +0000 UTC m=+153.279182631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.939470 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-catalog-content\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.940795 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-utilities\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.940830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvg5p\" (UniqueName: \"kubernetes.io/projected/4dccc964-0fe8-499e-a852-d971100829c1-kube-api-access-mvg5p\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.940897 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:31 crc kubenswrapper[4728]: E1216 14:59:31.941709 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.441695615 +0000 UTC m=+153.281874599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.981792 4728 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.984817 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zk7d"]
Dec 16 14:59:31 crc kubenswrapper[4728]: I1216 14:59:31.986242 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zk7d"
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.005976 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zk7d"]
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.042526 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.042781 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-catalog-content\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.042875 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-utilities\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.042900 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvg5p\" (UniqueName: \"kubernetes.io/projected/4dccc964-0fe8-499e-a852-d971100829c1-kube-api-access-mvg5p\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp"
Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.043256 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.543239118 +0000 UTC m=+153.383418092 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.043600 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-catalog-content\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.043810 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-utilities\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.073434 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvg5p\" (UniqueName: \"kubernetes.io/projected/4dccc964-0fe8-499e-a852-d971100829c1-kube-api-access-mvg5p\") pod \"community-operators-hlwvp\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " pod="openshift-marketplace/community-operators-hlwvp" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.144714 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-utilities\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.144963 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.145012 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgf4d\" (UniqueName: \"kubernetes.io/projected/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-kube-api-access-qgf4d\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.145159 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-catalog-content\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.145247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.149822 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.649803678 +0000 UTC m=+153.489982662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.178121 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgppn"] Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.249489 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.249719 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.749680069 +0000 UTC m=+153.589859063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.249820 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-utilities\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.249920 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgf4d\" (UniqueName: \"kubernetes.io/projected/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-kube-api-access-qgf4d\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.250009 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-catalog-content\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.250079 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.250436 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-utilities\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.250481 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.750468689 +0000 UTC m=+153.590647673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.250927 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-catalog-content\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.270130 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgf4d\" (UniqueName: \"kubernetes.io/projected/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-kube-api-access-qgf4d\") pod \"certified-operators-9zk7d\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.327986 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.338380 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlwvp"] Dec 16 14:59:32 crc kubenswrapper[4728]: W1216 14:59:32.345608 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dccc964_0fe8_499e_a852_d971100829c1.slice/crio-a3cec02942e1678ae760a51c6373a09173f6ef2595bc4e31ec2e2eb1b8637e31 WatchSource:0}: Error finding container a3cec02942e1678ae760a51c6373a09173f6ef2595bc4e31ec2e2eb1b8637e31: Status 404 returned error can't find the container with id a3cec02942e1678ae760a51c6373a09173f6ef2595bc4e31ec2e2eb1b8637e31 Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.351794 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.351943 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.85189861 +0000 UTC m=+153.692077594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.352130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.352491 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.852483425 +0000 UTC m=+153.692662409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.423620 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:32 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:32 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:32 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.423865 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.434540 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-78qzz"] Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.453392 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.453728 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:32.95371152 +0000 UTC m=+153.793890504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 16 14:59:32 crc kubenswrapper[4728]: W1216 14:59:32.458203 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5541d34_e213_4545_af81_6410a52db88d.slice/crio-a29edac15c726cff514496e4484b8f4f5c3e2858fdbbed5651bf1ce543bfa626 WatchSource:0}: Error finding container a29edac15c726cff514496e4484b8f4f5c3e2858fdbbed5651bf1ce543bfa626: Status 404 returned error can't find the container with id a29edac15c726cff514496e4484b8f4f5c3e2858fdbbed5651bf1ce543bfa626 Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.554438 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.554785 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-16 14:59:33.054769351 +0000 UTC m=+153.894948335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m9gpr" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.571925 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zk7d"]
Dec 16 14:59:32 crc kubenswrapper[4728]: W1216 14:59:32.611993 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b64cba1_1ce1_4715_bd9f_831d3db30fc2.slice/crio-64fbd43d3eba207cbc8bddd1f145917f19a20a8663de7d551f80e7eef854a77d WatchSource:0}: Error finding container 64fbd43d3eba207cbc8bddd1f145917f19a20a8663de7d551f80e7eef854a77d: Status 404 returned error can't find the container with id 64fbd43d3eba207cbc8bddd1f145917f19a20a8663de7d551f80e7eef854a77d
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.655716 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
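
Note: every MountVolume.MountDevice and UnmountVolume.TearDown failure in this stretch has a single root cause: the kubevirt.io.hostpath-provisioner CSI driver has not yet completed registration with this kubelet, so no CSI client can be constructed. nestedpendingoperations.go serializes operations per volume and gates each retry behind a backoff window; the durationBeforeRetry of 500ms seen here is the initial value, and in upstream kubelet it roughly doubles per consecutive failure up to a cap of about two minutes. A minimal Go sketch of that gating pattern, using hypothetical type and field names rather than kubelet's actual ones:

    // backoff_sketch.go: illustrative retry gating in the style of
    // nestedpendingoperations.go; names and the cap are assumptions,
    // except the 500ms initial backoff visible in the log.
    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    const (
    	initialDurationBeforeRetry = 500 * time.Millisecond
    	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second
    )

    type pendingOp struct {
    	failures      int
    	lastError     error
    	retryNotAfter time.Time // "No retries permitted until ..."
    }

    func (op *pendingOp) recordFailure(err error, now time.Time) {
    	op.failures++
    	op.lastError = err
    	backoff := initialDurationBeforeRetry
    	for i := 1; i < op.failures && backoff < maxDurationBeforeRetry; i++ {
    		backoff *= 2 // exponential growth, capped below
    	}
    	if backoff > maxDurationBeforeRetry {
    		backoff = maxDurationBeforeRetry
    	}
    	op.retryNotAfter = now.Add(backoff)
    }

    func (op *pendingOp) mayRetry(now time.Time) bool {
    	return now.After(op.retryNotAfter)
    }

    func main() {
    	op := &pendingOp{}
    	now := time.Now()
    	op.recordFailure(errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers"), now)
    	fmt.Printf("no retries permitted until %s (durationBeforeRetry %s)\n",
    		op.retryNotAfter.Format(time.RFC3339Nano), op.retryNotAfter.Sub(now))
    }

Because the driver finishes registering within about a second (below), the backoff here never grows past its initial 500ms.
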
Dec 16 14:59:32 crc kubenswrapper[4728]: E1216 14:59:32.656087 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-16 14:59:33.156072599 +0000 UTC m=+153.996251573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.689501 4728 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-16T14:59:31.981822984Z","Handler":null,"Name":""}
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.696752 4728 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.696781 4728 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.757164 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr"
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.759637 4728 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
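
Note: this is the turning point of the whole stretch. The plugin watcher had already picked up the registration socket at 14:59:31.981792 (plugin_watcher.go, further up); here the reconciler runs RegisterPlugin, kubelet validates the driver's name and advertised version over that socket, and registers the endpoint /var/lib/kubelet/plugins/csi-hostpath/csi.sock. The next MountVolume attempt can finally build a CSI client. The csi_attacher.go line also explains why MountDevice is a no-op for this driver: it does not advertise the STAGE_UNSTAGE_VOLUME node capability, so kubelet skips NodeStageVolume and proceeds directly to NodePublishVolume. A sketch of that capability gate against the CSI spec's generated Go bindings (the helper name is made up; error handling trimmed; not kubelet's actual code):

    // Package csisketch shows the STAGE_UNSTAGE_VOLUME gate in the style of
    // kubelet's csi_attacher.go.
    package csisketch

    import (
    	"context"

    	csi "github.com/container-storage-interface/spec/lib/go/csi"
    )

    // nodeSupportsStageUnstage asks the driver's Node service for its
    // capabilities. If STAGE_UNSTAGE_VOLUME is absent, staging (MountDevice)
    // is skipped and kubelet goes straight to NodePublishVolume (SetUp).
    func nodeSupportsStageUnstage(ctx context.Context, node csi.NodeClient) (bool, error) {
    	resp, err := node.NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
    	if err != nil {
    		return false, err
    	}
    	for _, c := range resp.GetCapabilities() {
    		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
    			return true, nil
    		}
    	}
    	// Logged as: "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..."
    	return false, nil
    }
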
Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.759669 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.776822 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m9gpr\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.857680 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.860183 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerStarted","Data":"64fbd43d3eba207cbc8bddd1f145917f19a20a8663de7d551f80e7eef854a77d"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.863855 4728 generic.go:334] "Generic (PLEG): container finished" podID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerID="c748d69a2439a0f9c53d84e0ab65d50b185532bbab20e210a06b1d69b610e1b4" exitCode=0 Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.863917 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerDied","Data":"c748d69a2439a0f9c53d84e0ab65d50b185532bbab20e210a06b1d69b610e1b4"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.863937 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerStarted","Data":"b930c8141146b334971e96d95434527a03670ae2a2b4387fd972312f715440ac"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.864775 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.865548 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.867065 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" event={"ID":"36b07558-0136-4600-86f4-d20044b9910d","Type":"ContainerStarted","Data":"cd2643f20f9c7c43ab6d66411a906ddf9d1e11d9ecd765fab11e687a7978a924"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.870173 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5541d34-e213-4545-af81-6410a52db88d" containerID="bad6df5493f9eaa82024fc4ece34b0f8e49ba896e9adc0aef31b2a69ccea7bbe" exitCode=0 Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.870226 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78qzz" event={"ID":"d5541d34-e213-4545-af81-6410a52db88d","Type":"ContainerDied","Data":"bad6df5493f9eaa82024fc4ece34b0f8e49ba896e9adc0aef31b2a69ccea7bbe"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.870244 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78qzz" event={"ID":"d5541d34-e213-4545-af81-6410a52db88d","Type":"ContainerStarted","Data":"a29edac15c726cff514496e4484b8f4f5c3e2858fdbbed5651bf1ce543bfa626"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.871995 4728 generic.go:334] "Generic (PLEG): container finished" podID="4dccc964-0fe8-499e-a852-d971100829c1" containerID="fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c" exitCode=0 Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.872099 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerDied","Data":"fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.872130 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerStarted","Data":"a3cec02942e1678ae760a51c6373a09173f6ef2595bc4e31ec2e2eb1b8637e31"} Dec 16 14:59:32 crc kubenswrapper[4728]: I1216 14:59:32.920435 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5t99c" podStartSLOduration=10.920395475 podStartE2EDuration="10.920395475s" podCreationTimestamp="2025-12-16 14:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:32.918830227 +0000 UTC m=+153.759009211" watchObservedRunningTime="2025-12-16 14:59:32.920395475 +0000 UTC m=+153.760574459" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.003843 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.043568 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.305373 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m9gpr"] Dec 16 14:59:33 crc kubenswrapper[4728]: W1216 14:59:33.310682 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod269fe7e0_633b_41d4_8a8f_cd39424229e4.slice/crio-9c3644477a754c1cbe91fe1f55bc8a964f8775668826a8878f3423b7dbc6d001 WatchSource:0}: Error finding container 9c3644477a754c1cbe91fe1f55bc8a964f8775668826a8878f3423b7dbc6d001: Status 404 returned error can't find the container with id 9c3644477a754c1cbe91fe1f55bc8a964f8775668826a8878f3423b7dbc6d001 Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.392331 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bxlrz"] Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.394404 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.397681 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.415230 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxlrz"] Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.427798 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:33 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:33 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:33 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.427889 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.466104 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-utilities\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.466164 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-catalog-content\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.466214 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqwm5\" (UniqueName: \"kubernetes.io/projected/c89065be-d4d7-4201-b4fd-f1bc18df6a60-kube-api-access-vqwm5\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " 
pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.513150 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.567778 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqwm5\" (UniqueName: \"kubernetes.io/projected/c89065be-d4d7-4201-b4fd-f1bc18df6a60-kube-api-access-vqwm5\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.568006 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-utilities\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.568120 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-catalog-content\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.568509 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-utilities\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.568620 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-catalog-content\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.588762 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqwm5\" (UniqueName: \"kubernetes.io/projected/c89065be-d4d7-4201-b4fd-f1bc18df6a60-kube-api-access-vqwm5\") pod \"redhat-marketplace-bxlrz\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.722031 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.785468 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fs9cj"] Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.788486 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.794176 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs9cj"] Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.876584 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5fr2\" (UniqueName: \"kubernetes.io/projected/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-kube-api-access-f5fr2\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.877108 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-utilities\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.877149 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-catalog-content\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.883771 4728 generic.go:334] "Generic (PLEG): container finished" podID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerID="5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0" exitCode=0 Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.883832 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerDied","Data":"5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0"} Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.887338 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" event={"ID":"269fe7e0-633b-41d4-8a8f-cd39424229e4","Type":"ContainerStarted","Data":"66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461"} Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.887385 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" event={"ID":"269fe7e0-633b-41d4-8a8f-cd39424229e4","Type":"ContainerStarted","Data":"9c3644477a754c1cbe91fe1f55bc8a964f8775668826a8878f3423b7dbc6d001"} Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.887457 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.890851 4728 generic.go:334] "Generic (PLEG): container finished" podID="f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" containerID="b2c0f614c2dbcb5a5db082689540734fe0a9834c6a5c241a4648c3a9d439210f" exitCode=0 Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.890950 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" event={"ID":"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d","Type":"ContainerDied","Data":"b2c0f614c2dbcb5a5db082689540734fe0a9834c6a5c241a4648c3a9d439210f"} Dec 16 14:59:33 crc 
kubenswrapper[4728]: I1216 14:59:33.942768 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" podStartSLOduration=130.942743104 podStartE2EDuration="2m10.942743104s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:33.940124198 +0000 UTC m=+154.780303192" watchObservedRunningTime="2025-12-16 14:59:33.942743104 +0000 UTC m=+154.782922098" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.960898 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxlrz"] Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.985086 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5fr2\" (UniqueName: \"kubernetes.io/projected/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-kube-api-access-f5fr2\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.985127 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-utilities\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.985169 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-catalog-content\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.985797 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-catalog-content\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:33 crc kubenswrapper[4728]: I1216 14:59:33.986622 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-utilities\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.001728 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5fr2\" (UniqueName: \"kubernetes.io/projected/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-kube-api-access-f5fr2\") pod \"redhat-marketplace-fs9cj\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.047185 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.049958 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.062370 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.062978 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.071607 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.158496 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.187805 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.188255 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.289264 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.289569 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.289653 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.308806 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.385948 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.425018 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:34 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:34 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:34 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.425077 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.437171 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.458977 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7rjsg" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.600370 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6vw5z"] Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.603636 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.604607 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vw5z"] Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.605358 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.639815 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs9cj"] Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.653004 4728 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6b6q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.653058 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x6b6q" podUID="7a82fd60-88d6-4e1a-b140-bf2f92ea0a87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.653121 4728 patch_prober.go:28] interesting pod/downloads-7954f5f757-x6b6q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.653174 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x6b6q" podUID="7a82fd60-88d6-4e1a-b140-bf2f92ea0a87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: 
connect: connection refused" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.694424 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxdf\" (UniqueName: \"kubernetes.io/projected/e74a33ea-23b7-47fc-a463-566f8b579917-kube-api-access-pbxdf\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.694711 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-utilities\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.694739 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-catalog-content\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.759329 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.796506 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-utilities\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.796546 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxdf\" (UniqueName: \"kubernetes.io/projected/e74a33ea-23b7-47fc-a463-566f8b579917-kube-api-access-pbxdf\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.796576 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-catalog-content\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.797016 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-catalog-content\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.797276 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-utilities\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.816149 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxdf\" (UniqueName: 
\"kubernetes.io/projected/e74a33ea-23b7-47fc-a463-566f8b579917-kube-api-access-pbxdf\") pod \"redhat-operators-6vw5z\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.902081 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4acbab1-e9c2-4c9f-a676-a7e7a3099114","Type":"ContainerStarted","Data":"f8cb16dbce50e6c735ca3ea6de8230c8010595a99959abc500e2449b227699a1"} Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.904325 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs9cj" event={"ID":"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa","Type":"ContainerStarted","Data":"46710557a3ddc54d976c951a44b2f53e2f820d11981c0075af285f1ae874b56b"} Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.916198 4728 generic.go:334] "Generic (PLEG): container finished" podID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerID="560a2330fd10ae2a1ca2d67b32d59081811a8de6d20cecea3666d64ab7e253e5" exitCode=0 Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.917945 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxlrz" event={"ID":"c89065be-d4d7-4201-b4fd-f1bc18df6a60","Type":"ContainerDied","Data":"560a2330fd10ae2a1ca2d67b32d59081811a8de6d20cecea3666d64ab7e253e5"} Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.917990 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxlrz" event={"ID":"c89065be-d4d7-4201-b4fd-f1bc18df6a60","Type":"ContainerStarted","Data":"17b14ce578a6fdec235d42d21ddfffc03c5827aaa3b49f8b8244f19f4f939058"} Dec 16 14:59:34 crc kubenswrapper[4728]: I1216 14:59:34.947984 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.003423 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fdgm"] Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.004424 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.012025 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdgm"] Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.081679 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.102182 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bh4dl" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.102363 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-catalog-content\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.102501 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g97c5\" (UniqueName: \"kubernetes.io/projected/b722868c-334c-4e92-a919-051140c48283-kube-api-access-g97c5\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.102535 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-utilities\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.204059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97c5\" (UniqueName: \"kubernetes.io/projected/b722868c-334c-4e92-a919-051140c48283-kube-api-access-g97c5\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.204118 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-utilities\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.204228 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-catalog-content\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.204707 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-catalog-content\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.204921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-utilities\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.236735 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g97c5\" (UniqueName: \"kubernetes.io/projected/b722868c-334c-4e92-a919-051140c48283-kube-api-access-g97c5\") pod \"redhat-operators-5fdgm\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.347282 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vw5z"] Dec 16 14:59:35 crc kubenswrapper[4728]: W1216 14:59:35.387569 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode74a33ea_23b7_47fc_a463_566f8b579917.slice/crio-46699ea4e925835ddc8acfedf187576712037b7401ba4a42cc75b59e8e2648b2 WatchSource:0}: Error finding container 46699ea4e925835ddc8acfedf187576712037b7401ba4a42cc75b59e8e2648b2: Status 404 returned error can't find the container with id 46699ea4e925835ddc8acfedf187576712037b7401ba4a42cc75b59e8e2648b2 Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.400291 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.421653 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.425348 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:35 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:35 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:35 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.425448 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.474270 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.524048 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-config-volume\") pod \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.524180 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-secret-volume\") pod \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.524243 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87z5\" (UniqueName: \"kubernetes.io/projected/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-kube-api-access-l87z5\") pod \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\" (UID: \"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d\") " Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.525986 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" (UID: "f0fa8699-bd81-4174-a0c1-6b9a3519ab0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.530646 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" (UID: "f0fa8699-bd81-4174-a0c1-6b9a3519ab0d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.534896 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-kube-api-access-l87z5" (OuterVolumeSpecName: "kube-api-access-l87z5") pod "f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" (UID: "f0fa8699-bd81-4174-a0c1-6b9a3519ab0d"). InnerVolumeSpecName "kube-api-access-l87z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.541710 4728 patch_prober.go:28] interesting pod/console-f9d7485db-pr5wl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.541794 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pr5wl" podUID="db457bae-59bc-4ec6-b5dd-8699c5794f76" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.592828 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.592862 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.628342 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.628389 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.628424 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l87z5\" (UniqueName: \"kubernetes.io/projected/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d-kube-api-access-l87z5\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.890890 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdgm"] Dec 16 14:59:35 crc kubenswrapper[4728]: W1216 14:59:35.924894 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb722868c_334c_4e92_a919_051140c48283.slice/crio-30374b7be49ab8ceed38149861a1e2f0712ec4c92d10f9d0c97e5b2a9997a094 WatchSource:0}: Error finding container 30374b7be49ab8ceed38149861a1e2f0712ec4c92d10f9d0c97e5b2a9997a094: Status 404 returned error can't find the container with id 30374b7be49ab8ceed38149861a1e2f0712ec4c92d10f9d0c97e5b2a9997a094 Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.975907 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4acbab1-e9c2-4c9f-a676-a7e7a3099114","Type":"ContainerStarted","Data":"87bbf521528dd3840cb1c5e70835a90bfa5920fed523c26e00e16412b4fbc576"} Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.992404 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.992384605 podStartE2EDuration="1.992384605s" podCreationTimestamp="2025-12-16 14:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:35.990636371 +0000 UTC m=+156.830815355" watchObservedRunningTime="2025-12-16 14:59:35.992384605 +0000 UTC m=+156.832563589" Dec 16 
14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.998717 4728 generic.go:334] "Generic (PLEG): container finished" podID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerID="f4bc2a5c807793c97651ae3c5d100c260895ba259e71795f2f21c6c9826d9a96" exitCode=0 Dec 16 14:59:35 crc kubenswrapper[4728]: I1216 14:59:35.998778 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs9cj" event={"ID":"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa","Type":"ContainerDied","Data":"f4bc2a5c807793c97651ae3c5d100c260895ba259e71795f2f21c6c9826d9a96"} Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.010722 4728 generic.go:334] "Generic (PLEG): container finished" podID="e74a33ea-23b7-47fc-a463-566f8b579917" containerID="9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d" exitCode=0 Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.010826 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerDied","Data":"9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d"} Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.010893 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerStarted","Data":"46699ea4e925835ddc8acfedf187576712037b7401ba4a42cc75b59e8e2648b2"} Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.034239 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.034676 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj" event={"ID":"f0fa8699-bd81-4174-a0c1-6b9a3519ab0d","Type":"ContainerDied","Data":"24c413d23c5fc3afe4bee5462992ddd78f6f43bd723bbdd3f2c5a2bb9e7f3251"} Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.035519 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c413d23c5fc3afe4bee5462992ddd78f6f43bd723bbdd3f2c5a2bb9e7f3251" Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.437445 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:36 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:36 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:36 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:36 crc kubenswrapper[4728]: I1216 14:59:36.437505 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.060466 4728 generic.go:334] "Generic (PLEG): container finished" podID="b722868c-334c-4e92-a919-051140c48283" containerID="2d7de4e3b602ffeb1d763eee72dca98a9f2ba04f61fb4709bae6ae8da3450b87" exitCode=0 Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.060576 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" 
event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerDied","Data":"2d7de4e3b602ffeb1d763eee72dca98a9f2ba04f61fb4709bae6ae8da3450b87"} Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.062748 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerStarted","Data":"30374b7be49ab8ceed38149861a1e2f0712ec4c92d10f9d0c97e5b2a9997a094"} Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.074995 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4acbab1-e9c2-4c9f-a676-a7e7a3099114" containerID="87bbf521528dd3840cb1c5e70835a90bfa5920fed523c26e00e16412b4fbc576" exitCode=0 Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.075043 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4acbab1-e9c2-4c9f-a676-a7e7a3099114","Type":"ContainerDied","Data":"87bbf521528dd3840cb1c5e70835a90bfa5920fed523c26e00e16412b4fbc576"} Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.360158 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m9d45" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.423790 4728 patch_prober.go:28] interesting pod/router-default-5444994796-55pkk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 16 14:59:37 crc kubenswrapper[4728]: [-]has-synced failed: reason withheld Dec 16 14:59:37 crc kubenswrapper[4728]: [+]process-running ok Dec 16 14:59:37 crc kubenswrapper[4728]: healthz check failed Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.424106 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-55pkk" podUID="05c53e6d-0610-4944-a6af-1fdb84368f05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.922520 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 14:59:37 crc kubenswrapper[4728]: E1216 14:59:37.922862 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" containerName="collect-profiles" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.922876 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" containerName="collect-profiles" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.922975 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" containerName="collect-profiles" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.927217 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.927327 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.935268 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.935600 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.999081 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0e5cdb1-c973-48db-b600-f55b623b79a4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:37 crc kubenswrapper[4728]: I1216 14:59:37.999227 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0e5cdb1-c973-48db-b600-f55b623b79a4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.100822 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0e5cdb1-c973-48db-b600-f55b623b79a4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.100942 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0e5cdb1-c973-48db-b600-f55b623b79a4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.101027 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0e5cdb1-c973-48db-b600-f55b623b79a4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.132709 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0e5cdb1-c973-48db-b600-f55b623b79a4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.255653 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.344774 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.426535 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.431881 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-55pkk" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.506251 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kubelet-dir\") pod \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.506299 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kube-api-access\") pod \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\" (UID: \"f4acbab1-e9c2-4c9f-a676-a7e7a3099114\") " Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.506516 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f4acbab1-e9c2-4c9f-a676-a7e7a3099114" (UID: "f4acbab1-e9c2-4c9f-a676-a7e7a3099114"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.506883 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.515163 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f4acbab1-e9c2-4c9f-a676-a7e7a3099114" (UID: "f4acbab1-e9c2-4c9f-a676-a7e7a3099114"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.608241 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4acbab1-e9c2-4c9f-a676-a7e7a3099114-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.768020 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.819347 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 14:59:38 crc kubenswrapper[4728]: I1216 14:59:38.819441 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 14:59:39 crc kubenswrapper[4728]: I1216 14:59:39.091647 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4acbab1-e9c2-4c9f-a676-a7e7a3099114","Type":"ContainerDied","Data":"f8cb16dbce50e6c735ca3ea6de8230c8010595a99959abc500e2449b227699a1"} Dec 16 14:59:39 crc kubenswrapper[4728]: I1216 14:59:39.091680 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 16 14:59:39 crc kubenswrapper[4728]: I1216 14:59:39.091687 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8cb16dbce50e6c735ca3ea6de8230c8010595a99959abc500e2449b227699a1" Dec 16 14:59:39 crc kubenswrapper[4728]: I1216 14:59:39.093017 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0e5cdb1-c973-48db-b600-f55b623b79a4","Type":"ContainerStarted","Data":"3eb6c5c825818abb3c9c3eb166f15ba76fd85768eb58777dea61976ab675097f"} Dec 16 14:59:40 crc kubenswrapper[4728]: I1216 14:59:40.100001 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0e5cdb1-c973-48db-b600-f55b623b79a4","Type":"ContainerStarted","Data":"5c2583fed9f562ab0e63203a81d852faac02c2a65049ee4799252a7db7783fdb"} Dec 16 14:59:40 crc kubenswrapper[4728]: I1216 14:59:40.113192 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.113172848 podStartE2EDuration="3.113172848s" podCreationTimestamp="2025-12-16 14:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 14:59:40.111791273 +0000 UTC m=+160.951970257" watchObservedRunningTime="2025-12-16 14:59:40.113172848 +0000 UTC m=+160.953351822" Dec 16 14:59:41 crc kubenswrapper[4728]: I1216 14:59:41.111370 4728 generic.go:334] "Generic (PLEG): container finished" podID="d0e5cdb1-c973-48db-b600-f55b623b79a4" containerID="5c2583fed9f562ab0e63203a81d852faac02c2a65049ee4799252a7db7783fdb" exitCode=0 Dec 16 14:59:41 crc kubenswrapper[4728]: I1216 14:59:41.111467 4728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0e5cdb1-c973-48db-b600-f55b623b79a4","Type":"ContainerDied","Data":"5c2583fed9f562ab0e63203a81d852faac02c2a65049ee4799252a7db7783fdb"} Dec 16 14:59:44 crc kubenswrapper[4728]: I1216 14:59:44.672564 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x6b6q" Dec 16 14:59:45 crc kubenswrapper[4728]: I1216 14:59:45.541703 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:45 crc kubenswrapper[4728]: I1216 14:59:45.545623 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 14:59:46 crc kubenswrapper[4728]: I1216 14:59:46.069743 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:46 crc kubenswrapper[4728]: I1216 14:59:46.078446 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d13ff897-af48-416f-ba3f-44f7e4344a75-metrics-certs\") pod \"network-metrics-daemon-kjxbh\" (UID: \"d13ff897-af48-416f-ba3f-44f7e4344a75\") " pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:46 crc kubenswrapper[4728]: I1216 14:59:46.253540 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kjxbh" Dec 16 14:59:47 crc kubenswrapper[4728]: I1216 14:59:47.365440 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hnrqv"] Dec 16 14:59:47 crc kubenswrapper[4728]: I1216 14:59:47.365702 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerName="controller-manager" containerID="cri-o://c5e41b060f63c748f410fa7ca33877b2198592217a99c826aa275dcba0f82979" gracePeriod=30 Dec 16 14:59:47 crc kubenswrapper[4728]: I1216 14:59:47.391099 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"] Dec 16 14:59:47 crc kubenswrapper[4728]: I1216 14:59:47.391589 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerName="route-controller-manager" containerID="cri-o://882c237e90ea1facfabc7e0d0e3a15400c3cd5ba9445b0b94e0df82d3e1a903a" gracePeriod=30 Dec 16 14:59:50 crc kubenswrapper[4728]: I1216 14:59:50.187155 4728 generic.go:334] "Generic (PLEG): container finished" podID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerID="882c237e90ea1facfabc7e0d0e3a15400c3cd5ba9445b0b94e0df82d3e1a903a" exitCode=0 Dec 16 14:59:50 crc kubenswrapper[4728]: I1216 14:59:50.187289 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" 
event={"ID":"2555bf9d-eb8f-4c07-b78a-8aec21d16b75","Type":"ContainerDied","Data":"882c237e90ea1facfabc7e0d0e3a15400c3cd5ba9445b0b94e0df82d3e1a903a"} Dec 16 14:59:50 crc kubenswrapper[4728]: I1216 14:59:50.190919 4728 generic.go:334] "Generic (PLEG): container finished" podID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerID="c5e41b060f63c748f410fa7ca33877b2198592217a99c826aa275dcba0f82979" exitCode=0 Dec 16 14:59:50 crc kubenswrapper[4728]: I1216 14:59:50.190988 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" event={"ID":"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf","Type":"ContainerDied","Data":"c5e41b060f63c748f410fa7ca33877b2198592217a99c826aa275dcba0f82979"} Dec 16 14:59:53 crc kubenswrapper[4728]: I1216 14:59:53.051011 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 14:59:54 crc kubenswrapper[4728]: I1216 14:59:54.309876 4728 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jfcxb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 16 14:59:54 crc kubenswrapper[4728]: I1216 14:59:54.309961 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 16 14:59:54 crc kubenswrapper[4728]: I1216 14:59:54.919909 4728 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hnrqv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 16 14:59:54 crc kubenswrapper[4728]: I1216 14:59:54.919980 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.150953 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8"] Dec 16 15:00:00 crc kubenswrapper[4728]: E1216 15:00:00.151979 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4acbab1-e9c2-4c9f-a676-a7e7a3099114" containerName="pruner" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.152008 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4acbab1-e9c2-4c9f-a676-a7e7a3099114" containerName="pruner" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.152208 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4acbab1-e9c2-4c9f-a676-a7e7a3099114" containerName="pruner" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.152791 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.155910 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.155932 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.180816 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8"] Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.299230 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-config-volume\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.299371 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzk2j\" (UniqueName: \"kubernetes.io/projected/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-kube-api-access-gzk2j\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.299435 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-secret-volume\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.401105 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-config-volume\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.401195 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzk2j\" (UniqueName: \"kubernetes.io/projected/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-kube-api-access-gzk2j\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.401237 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-secret-volume\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.404233 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-config-volume\") pod 
\"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.407627 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-secret-volume\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.425800 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzk2j\" (UniqueName: \"kubernetes.io/projected/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-kube-api-access-gzk2j\") pod \"collect-profiles-29431620-sv6v8\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:00 crc kubenswrapper[4728]: I1216 15:00:00.525142 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:02 crc kubenswrapper[4728]: I1216 15:00:02.917322 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 15:00:02 crc kubenswrapper[4728]: I1216 15:00:02.924164 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.041637 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0e5cdb1-c973-48db-b600-f55b623b79a4-kubelet-dir\") pod \"d0e5cdb1-c973-48db-b600-f55b623b79a4\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.041906 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-config\") pod \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.041933 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-serving-cert\") pod \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.041947 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-proxy-ca-bundles\") pod \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.041982 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-client-ca\") pod \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.042015 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkn7q\" (UniqueName: 
\"kubernetes.io/projected/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-kube-api-access-wkn7q\") pod \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\" (UID: \"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.042039 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0e5cdb1-c973-48db-b600-f55b623b79a4-kube-api-access\") pod \"d0e5cdb1-c973-48db-b600-f55b623b79a4\" (UID: \"d0e5cdb1-c973-48db-b600-f55b623b79a4\") " Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.041779 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0e5cdb1-c973-48db-b600-f55b623b79a4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d0e5cdb1-c973-48db-b600-f55b623b79a4" (UID: "d0e5cdb1-c973-48db-b600-f55b623b79a4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.043129 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" (UID: "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.043206 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" (UID: "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.043258 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-config" (OuterVolumeSpecName: "config") pod "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" (UID: "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.050031 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-kube-api-access-wkn7q" (OuterVolumeSpecName: "kube-api-access-wkn7q") pod "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" (UID: "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf"). InnerVolumeSpecName "kube-api-access-wkn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.055954 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e5cdb1-c973-48db-b600-f55b623b79a4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d0e5cdb1-c973-48db-b600-f55b623b79a4" (UID: "d0e5cdb1-c973-48db-b600-f55b623b79a4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.060266 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" (UID: "fe87da2c-8dce-4856-9fff-fdb78f7c4dcf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.143958 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.144005 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0e5cdb1-c973-48db-b600-f55b623b79a4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.144016 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.144024 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.144037 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.144047 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkn7q\" (UniqueName: \"kubernetes.io/projected/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf-kube-api-access-wkn7q\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.144058 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0e5cdb1-c973-48db-b600-f55b623b79a4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.269011 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" event={"ID":"fe87da2c-8dce-4856-9fff-fdb78f7c4dcf","Type":"ContainerDied","Data":"52040c599a95bb3a936989fab87ff71c5419f13ddb3d02f5806cc8fde247aba2"} Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.269052 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hnrqv" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.269092 4728 scope.go:117] "RemoveContainer" containerID="c5e41b060f63c748f410fa7ca33877b2198592217a99c826aa275dcba0f82979" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.271903 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d0e5cdb1-c973-48db-b600-f55b623b79a4","Type":"ContainerDied","Data":"3eb6c5c825818abb3c9c3eb166f15ba76fd85768eb58777dea61976ab675097f"} Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.271953 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb6c5c825818abb3c9c3eb166f15ba76fd85768eb58777dea61976ab675097f" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.272019 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.308236 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hnrqv"] Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.311788 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hnrqv"] Dec 16 15:00:03 crc kubenswrapper[4728]: I1216 15:00:03.517001 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" path="/var/lib/kubelet/pods/fe87da2c-8dce-4856-9fff-fdb78f7c4dcf/volumes" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.006191 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-68vs6" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.228601 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74c887975f-zqtn9"] Dec 16 15:00:05 crc kubenswrapper[4728]: E1216 15:00:05.228904 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e5cdb1-c973-48db-b600-f55b623b79a4" containerName="pruner" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.228928 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e5cdb1-c973-48db-b600-f55b623b79a4" containerName="pruner" Dec 16 15:00:05 crc kubenswrapper[4728]: E1216 15:00:05.228948 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerName="controller-manager" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.228960 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerName="controller-manager" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.229107 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e5cdb1-c973-48db-b600-f55b623b79a4" containerName="pruner" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.229140 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe87da2c-8dce-4856-9fff-fdb78f7c4dcf" containerName="controller-manager" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.229740 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.235757 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.235916 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.236774 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.237117 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.237293 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.237678 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.239344 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74c887975f-zqtn9"] Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.255763 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.309818 4728 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jfcxb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: i/o timeout (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.309886 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.373334 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnkbt\" (UniqueName: \"kubernetes.io/projected/0a7e62a1-e81b-402d-99af-c439c008405a-kube-api-access-pnkbt\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.373478 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-client-ca\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.373531 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-config\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.373561 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-proxy-ca-bundles\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.373586 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7e62a1-e81b-402d-99af-c439c008405a-serving-cert\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.474497 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnkbt\" (UniqueName: \"kubernetes.io/projected/0a7e62a1-e81b-402d-99af-c439c008405a-kube-api-access-pnkbt\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.474842 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-client-ca\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.474892 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-config\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.474931 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-proxy-ca-bundles\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.474971 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7e62a1-e81b-402d-99af-c439c008405a-serving-cert\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.476521 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-config\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " 
pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.476777 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-proxy-ca-bundles\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.477331 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-client-ca\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.480783 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7e62a1-e81b-402d-99af-c439c008405a-serving-cert\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.492275 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnkbt\" (UniqueName: \"kubernetes.io/projected/0a7e62a1-e81b-402d-99af-c439c008405a-kube-api-access-pnkbt\") pod \"controller-manager-74c887975f-zqtn9\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.554610 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.853376 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.983177 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-config\") pod \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.983476 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bv4p\" (UniqueName: \"kubernetes.io/projected/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-kube-api-access-7bv4p\") pod \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.983528 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-serving-cert\") pod \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.983642 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-client-ca\") pod \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\" (UID: \"2555bf9d-eb8f-4c07-b78a-8aec21d16b75\") " Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.984516 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-client-ca" (OuterVolumeSpecName: "client-ca") pod "2555bf9d-eb8f-4c07-b78a-8aec21d16b75" (UID: "2555bf9d-eb8f-4c07-b78a-8aec21d16b75"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.985908 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-config" (OuterVolumeSpecName: "config") pod "2555bf9d-eb8f-4c07-b78a-8aec21d16b75" (UID: "2555bf9d-eb8f-4c07-b78a-8aec21d16b75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.998536 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2555bf9d-eb8f-4c07-b78a-8aec21d16b75" (UID: "2555bf9d-eb8f-4c07-b78a-8aec21d16b75"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:05 crc kubenswrapper[4728]: I1216 15:00:05.999712 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-kube-api-access-7bv4p" (OuterVolumeSpecName: "kube-api-access-7bv4p") pod "2555bf9d-eb8f-4c07-b78a-8aec21d16b75" (UID: "2555bf9d-eb8f-4c07-b78a-8aec21d16b75"). InnerVolumeSpecName "kube-api-access-7bv4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.085635 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.085674 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.085688 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bv4p\" (UniqueName: \"kubernetes.io/projected/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-kube-api-access-7bv4p\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.085699 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2555bf9d-eb8f-4c07-b78a-8aec21d16b75-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.288687 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" event={"ID":"2555bf9d-eb8f-4c07-b78a-8aec21d16b75","Type":"ContainerDied","Data":"af3907fb59a249bceefca1f6bb2d89bbd71e11b94a029843613d620382976431"} Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.288746 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.318987 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"] Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.324062 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jfcxb"] Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.617064 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74c887975f-zqtn9"] Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.736041 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62"] Dec 16 15:00:06 crc kubenswrapper[4728]: E1216 15:00:06.736234 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerName="route-controller-manager" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.736245 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerName="route-controller-manager" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.736342 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" containerName="route-controller-manager" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.736693 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.737892 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.740253 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.740421 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.740484 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.740548 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.740625 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.752179 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62"] Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.774913 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.896140 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmj99\" (UniqueName: \"kubernetes.io/projected/68e0932b-b003-420f-b476-f161a5b5193f-kube-api-access-cmj99\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.896545 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68e0932b-b003-420f-b476-f161a5b5193f-serving-cert\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.896591 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-client-ca\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.896618 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-config\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.998683 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmj99\" (UniqueName: \"kubernetes.io/projected/68e0932b-b003-420f-b476-f161a5b5193f-kube-api-access-cmj99\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.998725 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68e0932b-b003-420f-b476-f161a5b5193f-serving-cert\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.998796 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-client-ca\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.999573 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-config\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:06 crc kubenswrapper[4728]: I1216 15:00:06.999933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-client-ca\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:07 crc kubenswrapper[4728]: I1216 15:00:07.000511 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-config\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:07 crc kubenswrapper[4728]: I1216 15:00:07.002630 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68e0932b-b003-420f-b476-f161a5b5193f-serving-cert\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:07 crc kubenswrapper[4728]: I1216 15:00:07.021778 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmj99\" (UniqueName: \"kubernetes.io/projected/68e0932b-b003-420f-b476-f161a5b5193f-kube-api-access-cmj99\") pod \"route-controller-manager-6446b6f9d5-tbl62\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:07 crc kubenswrapper[4728]: I1216 15:00:07.052657 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:07 crc kubenswrapper[4728]: E1216 15:00:07.324769 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 15:00:07 crc kubenswrapper[4728]: E1216 15:00:07.324933 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqwm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bxlrz_openshift-marketplace(c89065be-d4d7-4201-b4fd-f1bc18df6a60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:07 crc kubenswrapper[4728]: E1216 15:00:07.326749 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bxlrz" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" Dec 16 15:00:07 crc kubenswrapper[4728]: I1216 15:00:07.511870 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2555bf9d-eb8f-4c07-b78a-8aec21d16b75" path="/var/lib/kubelet/pods/2555bf9d-eb8f-4c07-b78a-8aec21d16b75/volumes" Dec 16 15:00:08 crc kubenswrapper[4728]: I1216 15:00:08.819458 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:00:08 crc kubenswrapper[4728]: I1216 15:00:08.819843 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" 
podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:00:10 crc kubenswrapper[4728]: E1216 15:00:10.113442 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bxlrz" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" Dec 16 15:00:10 crc kubenswrapper[4728]: E1216 15:00:10.178185 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 16 15:00:10 crc kubenswrapper[4728]: E1216 15:00:10.178329 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5fr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fs9cj_openshift-marketplace(2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:10 crc kubenswrapper[4728]: E1216 15:00:10.179930 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fs9cj" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" Dec 16 15:00:10 crc kubenswrapper[4728]: E1216 15:00:10.242971 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 16 15:00:10 crc 
kubenswrapper[4728]: E1216 15:00:10.243116 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbxdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6vw5z_openshift-marketplace(e74a33ea-23b7-47fc-a463-566f8b579917): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:10 crc kubenswrapper[4728]: E1216 15:00:10.244451 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6vw5z" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" Dec 16 15:00:11 crc kubenswrapper[4728]: E1216 15:00:11.664270 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6vw5z" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" Dec 16 15:00:11 crc kubenswrapper[4728]: E1216 15:00:11.669704 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fs9cj" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" Dec 16 15:00:11 crc kubenswrapper[4728]: E1216 15:00:11.739370 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 15:00:11 crc kubenswrapper[4728]: E1216 15:00:11.739580 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fffqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-78qzz_openshift-marketplace(d5541d34-e213-4545-af81-6410a52db88d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:11 crc kubenswrapper[4728]: E1216 15:00:11.741041 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-78qzz" podUID="d5541d34-e213-4545-af81-6410a52db88d" Dec 16 15:00:11 crc kubenswrapper[4728]: I1216 15:00:11.924939 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 15:00:11 crc kubenswrapper[4728]: I1216 15:00:11.925701 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:11 crc kubenswrapper[4728]: I1216 15:00:11.927163 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 15:00:11 crc kubenswrapper[4728]: I1216 15:00:11.928787 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 15:00:11 crc kubenswrapper[4728]: I1216 15:00:11.929667 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.077330 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.077397 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.178268 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.178342 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.178401 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.212740 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:12 crc kubenswrapper[4728]: I1216 15:00:12.250916 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.193875 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-78qzz" podUID="d5541d34-e213-4545-af81-6410a52db88d" Dec 16 15:00:14 crc kubenswrapper[4728]: I1216 15:00:14.203050 4728 scope.go:117] "RemoveContainer" containerID="882c237e90ea1facfabc7e0d0e3a15400c3cd5ba9445b0b94e0df82d3e1a903a" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.270099 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.270278 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qtvx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rgppn_openshift-marketplace(c5d6795c-254f-428c-9fc2-c37b2e224b54): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.271491 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rgppn" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.288179 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.288351 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgf4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9zk7d_openshift-marketplace(8b64cba1-1ce1-4715-bd9f-831d3db30fc2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.289580 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9zk7d" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.332249 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.332378 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvg5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hlwvp_openshift-marketplace(4dccc964-0fe8-499e-a852-d971100829c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.333585 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hlwvp" podUID="4dccc964-0fe8-499e-a852-d971100829c1" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.354424 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9zk7d" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.354453 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rgppn" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" Dec 16 15:00:14 crc kubenswrapper[4728]: E1216 15:00:14.354498 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hlwvp" podUID="4dccc964-0fe8-499e-a852-d971100829c1" Dec 16 15:00:14 crc kubenswrapper[4728]: I1216 15:00:14.754094 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8"] Dec 16 15:00:14 crc kubenswrapper[4728]: W1216 15:00:14.763722 4728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79af2ee0_daef_4cca_ac1b_089d9e5be4ae.slice/crio-64d9b1615983bfc595926cd6616ccf14385520b2bae20e6fd9ee094b873e0608 WatchSource:0}: Error finding container 64d9b1615983bfc595926cd6616ccf14385520b2bae20e6fd9ee094b873e0608: Status 404 returned error can't find the container with id 64d9b1615983bfc595926cd6616ccf14385520b2bae20e6fd9ee094b873e0608 Dec 16 15:00:14 crc kubenswrapper[4728]: I1216 15:00:14.873961 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74c887975f-zqtn9"] Dec 16 15:00:14 crc kubenswrapper[4728]: I1216 15:00:14.878837 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kjxbh"] Dec 16 15:00:14 crc kubenswrapper[4728]: I1216 15:00:14.888650 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62"] Dec 16 15:00:14 crc kubenswrapper[4728]: I1216 15:00:14.894047 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 16 15:00:14 crc kubenswrapper[4728]: W1216 15:00:14.898693 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ec1a063_bbf0_47d5_aafe_75fc842ee69f.slice/crio-7702ddc8f53fd3e23d7ece881ba5e665afaba4cb56911767733a74a6ed9f5a96 WatchSource:0}: Error finding container 7702ddc8f53fd3e23d7ece881ba5e665afaba4cb56911767733a74a6ed9f5a96: Status 404 returned error can't find the container with id 7702ddc8f53fd3e23d7ece881ba5e665afaba4cb56911767733a74a6ed9f5a96 Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.358774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4ec1a063-bbf0-47d5-aafe-75fc842ee69f","Type":"ContainerStarted","Data":"7702ddc8f53fd3e23d7ece881ba5e665afaba4cb56911767733a74a6ed9f5a96"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.360654 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerStarted","Data":"3b508f4b5e573abda9f40fbc51d31e7edc096fac27ef88cf0697e8edb24c337e"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.362937 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" event={"ID":"79af2ee0-daef-4cca-ac1b-089d9e5be4ae","Type":"ContainerStarted","Data":"031a2d2e04ec331e2e85075abce071023cfa24f5e62f65467cc25018751e1b98"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.362966 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" event={"ID":"79af2ee0-daef-4cca-ac1b-089d9e5be4ae","Type":"ContainerStarted","Data":"64d9b1615983bfc595926cd6616ccf14385520b2bae20e6fd9ee094b873e0608"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.364075 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" event={"ID":"68e0932b-b003-420f-b476-f161a5b5193f","Type":"ContainerStarted","Data":"c23b0c3cb504707a4873ba8c038d9cdcc9dc9b542fe865285005b2da854862b6"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.366498 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" 
event={"ID":"d13ff897-af48-416f-ba3f-44f7e4344a75","Type":"ContainerStarted","Data":"07b3947d80cda0da011ebacaec1687b82f1c189927609905f5b40010f3cd7ed3"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.366537 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" event={"ID":"0a7e62a1-e81b-402d-99af-c439c008405a","Type":"ContainerStarted","Data":"8bbf55c10147abdc144aaace64bde498a9424d8ff18f9ca6ac07d0c83a73f320"} Dec 16 15:00:15 crc kubenswrapper[4728]: I1216 15:00:15.396072 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" podStartSLOduration=15.396003344 podStartE2EDuration="15.396003344s" podCreationTimestamp="2025-12-16 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:15.392241238 +0000 UTC m=+196.232420222" watchObservedRunningTime="2025-12-16 15:00:15.396003344 +0000 UTC m=+196.236182328" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.373671 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" event={"ID":"0a7e62a1-e81b-402d-99af-c439c008405a","Type":"ContainerStarted","Data":"4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.374068 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.373776 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" podUID="0a7e62a1-e81b-402d-99af-c439c008405a" containerName="controller-manager" containerID="cri-o://4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956" gracePeriod=30 Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.376806 4728 generic.go:334] "Generic (PLEG): container finished" podID="4ec1a063-bbf0-47d5-aafe-75fc842ee69f" containerID="6499713297dbfb419192e94c34e14a5249df9889cdc1c3a71c8a3886378be089" exitCode=0 Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.376901 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4ec1a063-bbf0-47d5-aafe-75fc842ee69f","Type":"ContainerDied","Data":"6499713297dbfb419192e94c34e14a5249df9889cdc1c3a71c8a3886378be089"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.380272 4728 generic.go:334] "Generic (PLEG): container finished" podID="b722868c-334c-4e92-a919-051140c48283" containerID="3b508f4b5e573abda9f40fbc51d31e7edc096fac27ef88cf0697e8edb24c337e" exitCode=0 Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.380352 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerDied","Data":"3b508f4b5e573abda9f40fbc51d31e7edc096fac27ef88cf0697e8edb24c337e"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.380485 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.383094 4728 generic.go:334] "Generic (PLEG): container finished" podID="79af2ee0-daef-4cca-ac1b-089d9e5be4ae" 
containerID="031a2d2e04ec331e2e85075abce071023cfa24f5e62f65467cc25018751e1b98" exitCode=0 Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.383156 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" event={"ID":"79af2ee0-daef-4cca-ac1b-089d9e5be4ae","Type":"ContainerDied","Data":"031a2d2e04ec331e2e85075abce071023cfa24f5e62f65467cc25018751e1b98"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.385369 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" event={"ID":"68e0932b-b003-420f-b476-f161a5b5193f","Type":"ContainerStarted","Data":"edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.385661 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.387499 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" event={"ID":"d13ff897-af48-416f-ba3f-44f7e4344a75","Type":"ContainerStarted","Data":"1980f757599f7a604283f8dd44a23e144a93856c1fdbb263c4ccc809bdebf843"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.387528 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kjxbh" event={"ID":"d13ff897-af48-416f-ba3f-44f7e4344a75","Type":"ContainerStarted","Data":"af795aa20372e0de16be4cfa1b0f8049b796a4ab464fac25a0cad171d79f332d"} Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.390827 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.394986 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" podStartSLOduration=29.394958623 podStartE2EDuration="29.394958623s" podCreationTimestamp="2025-12-16 14:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:16.389646979 +0000 UTC m=+197.229825963" watchObservedRunningTime="2025-12-16 15:00:16.394958623 +0000 UTC m=+197.235137607" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.405950 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" podStartSLOduration=10.405933509 podStartE2EDuration="10.405933509s" podCreationTimestamp="2025-12-16 15:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:16.405894948 +0000 UTC m=+197.246073952" watchObservedRunningTime="2025-12-16 15:00:16.405933509 +0000 UTC m=+197.246112493" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.421097 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kjxbh" podStartSLOduration=173.42106661 podStartE2EDuration="2m53.42106661s" podCreationTimestamp="2025-12-16 14:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:16.420071715 +0000 UTC 
m=+197.260250699" watchObservedRunningTime="2025-12-16 15:00:16.42106661 +0000 UTC m=+197.261245594" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.712119 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.738256 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dd799fbcb-vr2rr"] Dec 16 15:00:16 crc kubenswrapper[4728]: E1216 15:00:16.738915 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7e62a1-e81b-402d-99af-c439c008405a" containerName="controller-manager" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.738928 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7e62a1-e81b-402d-99af-c439c008405a" containerName="controller-manager" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.739022 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7e62a1-e81b-402d-99af-c439c008405a" containerName="controller-manager" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.739451 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.751726 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dd799fbcb-vr2rr"] Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.863780 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7e62a1-e81b-402d-99af-c439c008405a-serving-cert\") pod \"0a7e62a1-e81b-402d-99af-c439c008405a\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.863831 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-client-ca\") pod \"0a7e62a1-e81b-402d-99af-c439c008405a\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.863856 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-proxy-ca-bundles\") pod \"0a7e62a1-e81b-402d-99af-c439c008405a\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.863904 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnkbt\" (UniqueName: \"kubernetes.io/projected/0a7e62a1-e81b-402d-99af-c439c008405a-kube-api-access-pnkbt\") pod \"0a7e62a1-e81b-402d-99af-c439c008405a\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.864270 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-config\") pod \"0a7e62a1-e81b-402d-99af-c439c008405a\" (UID: \"0a7e62a1-e81b-402d-99af-c439c008405a\") " Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.864455 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-proxy-ca-bundles\") pod 
\"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.864520 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-client-ca\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.864556 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-serving-cert\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.864619 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-config\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.864672 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2clvk\" (UniqueName: \"kubernetes.io/projected/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-kube-api-access-2clvk\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.865186 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a7e62a1-e81b-402d-99af-c439c008405a" (UID: "0a7e62a1-e81b-402d-99af-c439c008405a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.865296 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-config" (OuterVolumeSpecName: "config") pod "0a7e62a1-e81b-402d-99af-c439c008405a" (UID: "0a7e62a1-e81b-402d-99af-c439c008405a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.865541 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0a7e62a1-e81b-402d-99af-c439c008405a" (UID: "0a7e62a1-e81b-402d-99af-c439c008405a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.871517 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7e62a1-e81b-402d-99af-c439c008405a-kube-api-access-pnkbt" (OuterVolumeSpecName: "kube-api-access-pnkbt") pod "0a7e62a1-e81b-402d-99af-c439c008405a" (UID: "0a7e62a1-e81b-402d-99af-c439c008405a"). 
InnerVolumeSpecName "kube-api-access-pnkbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.871743 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7e62a1-e81b-402d-99af-c439c008405a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a7e62a1-e81b-402d-99af-c439c008405a" (UID: "0a7e62a1-e81b-402d-99af-c439c008405a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965563 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-config\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965666 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2clvk\" (UniqueName: \"kubernetes.io/projected/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-kube-api-access-2clvk\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965737 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-proxy-ca-bundles\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965795 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-client-ca\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965859 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-serving-cert\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965972 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.965993 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a7e62a1-e81b-402d-99af-c439c008405a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.966013 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.966033 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0a7e62a1-e81b-402d-99af-c439c008405a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.966053 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnkbt\" (UniqueName: \"kubernetes.io/projected/0a7e62a1-e81b-402d-99af-c439c008405a-kube-api-access-pnkbt\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.966791 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-client-ca\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.966950 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-config\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.967928 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-proxy-ca-bundles\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.971820 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-serving-cert\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:16 crc kubenswrapper[4728]: I1216 15:00:16.987175 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2clvk\" (UniqueName: \"kubernetes.io/projected/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-kube-api-access-2clvk\") pod \"controller-manager-dd799fbcb-vr2rr\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.065166 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.276426 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dd799fbcb-vr2rr"] Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.394922 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerStarted","Data":"e18df7954027147ab0bb73754629f6cd84b0858d96f2f7c69c11aee60607c3b9"} Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.396662 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" event={"ID":"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf","Type":"ContainerStarted","Data":"c6f5d0966b1019af6eac0bb1edbfb58774a12e6c6a1fb3c3b55331fdb88386e8"} Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.398043 4728 generic.go:334] "Generic (PLEG): container finished" podID="0a7e62a1-e81b-402d-99af-c439c008405a" containerID="4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956" exitCode=0 Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.398082 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.398078 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" event={"ID":"0a7e62a1-e81b-402d-99af-c439c008405a","Type":"ContainerDied","Data":"4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956"} Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.398139 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c887975f-zqtn9" event={"ID":"0a7e62a1-e81b-402d-99af-c439c008405a","Type":"ContainerDied","Data":"8bbf55c10147abdc144aaace64bde498a9424d8ff18f9ca6ac07d0c83a73f320"} Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.398157 4728 scope.go:117] "RemoveContainer" containerID="4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.413997 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fdgm" podStartSLOduration=3.530007631 podStartE2EDuration="43.413978408s" podCreationTimestamp="2025-12-16 14:59:34 +0000 UTC" firstStartedPulling="2025-12-16 14:59:37.068256679 +0000 UTC m=+157.908435663" lastFinishedPulling="2025-12-16 15:00:16.952227456 +0000 UTC m=+197.792406440" observedRunningTime="2025-12-16 15:00:17.408759437 +0000 UTC m=+198.248938421" watchObservedRunningTime="2025-12-16 15:00:17.413978408 +0000 UTC m=+198.254157392" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.425280 4728 scope.go:117] "RemoveContainer" containerID="4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956" Dec 16 15:00:17 crc kubenswrapper[4728]: E1216 15:00:17.426147 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956\": container with ID starting with 4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956 not found: ID does not exist" containerID="4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956" Dec 16 
15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.426175 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956"} err="failed to get container status \"4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956\": rpc error: code = NotFound desc = could not find container \"4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956\": container with ID starting with 4d0eb46dce2d84c802e93d9b1a0a04a5f5a6a826634997801881a0dd3cb29956 not found: ID does not exist" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.438451 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74c887975f-zqtn9"] Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.441035 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74c887975f-zqtn9"] Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.525775 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7e62a1-e81b-402d-99af-c439c008405a" path="/var/lib/kubelet/pods/0a7e62a1-e81b-402d-99af-c439c008405a/volumes" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.678779 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.687113 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.879672 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-secret-volume\") pod \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.879725 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-config-volume\") pod \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.879774 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kube-api-access\") pod \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.879801 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kubelet-dir\") pod \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\" (UID: \"4ec1a063-bbf0-47d5-aafe-75fc842ee69f\") " Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.879819 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzk2j\" (UniqueName: \"kubernetes.io/projected/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-kube-api-access-gzk2j\") pod \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\" (UID: \"79af2ee0-daef-4cca-ac1b-089d9e5be4ae\") " Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.880188 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ec1a063-bbf0-47d5-aafe-75fc842ee69f" (UID: "4ec1a063-bbf0-47d5-aafe-75fc842ee69f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.880548 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "79af2ee0-daef-4cca-ac1b-089d9e5be4ae" (UID: "79af2ee0-daef-4cca-ac1b-089d9e5be4ae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.885002 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-kube-api-access-gzk2j" (OuterVolumeSpecName: "kube-api-access-gzk2j") pod "79af2ee0-daef-4cca-ac1b-089d9e5be4ae" (UID: "79af2ee0-daef-4cca-ac1b-089d9e5be4ae"). InnerVolumeSpecName "kube-api-access-gzk2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.885103 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ec1a063-bbf0-47d5-aafe-75fc842ee69f" (UID: "4ec1a063-bbf0-47d5-aafe-75fc842ee69f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.886650 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "79af2ee0-daef-4cca-ac1b-089d9e5be4ae" (UID: "79af2ee0-daef-4cca-ac1b-089d9e5be4ae"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.980976 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.981023 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ec1a063-bbf0-47d5-aafe-75fc842ee69f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.981068 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzk2j\" (UniqueName: \"kubernetes.io/projected/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-kube-api-access-gzk2j\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.981084 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:17 crc kubenswrapper[4728]: I1216 15:00:17.981099 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79af2ee0-daef-4cca-ac1b-089d9e5be4ae-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.404394 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" event={"ID":"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf","Type":"ContainerStarted","Data":"3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881"} Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.404738 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.410378 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.411183 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4ec1a063-bbf0-47d5-aafe-75fc842ee69f","Type":"ContainerDied","Data":"7702ddc8f53fd3e23d7ece881ba5e665afaba4cb56911767733a74a6ed9f5a96"} Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.411214 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7702ddc8f53fd3e23d7ece881ba5e665afaba4cb56911767733a74a6ed9f5a96" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.411259 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.417634 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.433347 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8" event={"ID":"79af2ee0-daef-4cca-ac1b-089d9e5be4ae","Type":"ContainerDied","Data":"64d9b1615983bfc595926cd6616ccf14385520b2bae20e6fd9ee094b873e0608"} Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.433381 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d9b1615983bfc595926cd6616ccf14385520b2bae20e6fd9ee094b873e0608" Dec 16 15:00:18 crc kubenswrapper[4728]: I1216 15:00:18.465656 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" podStartSLOduration=12.465620673 podStartE2EDuration="12.465620673s" podCreationTimestamp="2025-12-16 15:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:18.432958692 +0000 UTC m=+199.273137676" watchObservedRunningTime="2025-12-16 15:00:18.465620673 +0000 UTC m=+199.305799697" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.517234 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 15:00:19 crc kubenswrapper[4728]: E1216 15:00:19.517456 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79af2ee0-daef-4cca-ac1b-089d9e5be4ae" containerName="collect-profiles" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.517473 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="79af2ee0-daef-4cca-ac1b-089d9e5be4ae" containerName="collect-profiles" Dec 16 15:00:19 crc kubenswrapper[4728]: E1216 15:00:19.517495 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec1a063-bbf0-47d5-aafe-75fc842ee69f" containerName="pruner" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.517503 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec1a063-bbf0-47d5-aafe-75fc842ee69f" containerName="pruner" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.517668 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec1a063-bbf0-47d5-aafe-75fc842ee69f" containerName="pruner" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.517681 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="79af2ee0-daef-4cca-ac1b-089d9e5be4ae" containerName="collect-profiles" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.518176 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.519903 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.522038 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.522220 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.602885 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.602990 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-var-lock\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.603031 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42661dad-bece-4f43-9621-9c04d54ecb5c-kube-api-access\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.704979 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.705084 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-var-lock\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.705164 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-var-lock\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.705165 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.705221 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42661dad-bece-4f43-9621-9c04d54ecb5c-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.724886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42661dad-bece-4f43-9621-9c04d54ecb5c-kube-api-access\") pod \"installer-9-crc\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:19 crc kubenswrapper[4728]: I1216 15:00:19.837189 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:00:20 crc kubenswrapper[4728]: I1216 15:00:20.021282 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 16 15:00:20 crc kubenswrapper[4728]: W1216 15:00:20.034709 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod42661dad_bece_4f43_9621_9c04d54ecb5c.slice/crio-8ae96622b989611f0ad04c8f26e5cde0d718f621f50feeb65440004d60b5053b WatchSource:0}: Error finding container 8ae96622b989611f0ad04c8f26e5cde0d718f621f50feeb65440004d60b5053b: Status 404 returned error can't find the container with id 8ae96622b989611f0ad04c8f26e5cde0d718f621f50feeb65440004d60b5053b Dec 16 15:00:20 crc kubenswrapper[4728]: I1216 15:00:20.628727 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"42661dad-bece-4f43-9621-9c04d54ecb5c","Type":"ContainerStarted","Data":"8ae96622b989611f0ad04c8f26e5cde0d718f621f50feeb65440004d60b5053b"} Dec 16 15:00:21 crc kubenswrapper[4728]: I1216 15:00:21.634998 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"42661dad-bece-4f43-9621-9c04d54ecb5c","Type":"ContainerStarted","Data":"ef96257eca7490f60c46d116791afdb7b453659a8af1ddd04a380412763a291b"} Dec 16 15:00:21 crc kubenswrapper[4728]: I1216 15:00:21.658094 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.6580715010000002 podStartE2EDuration="2.658071501s" podCreationTimestamp="2025-12-16 15:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:21.648351438 +0000 UTC m=+202.488530432" watchObservedRunningTime="2025-12-16 15:00:21.658071501 +0000 UTC m=+202.498250505" Dec 16 15:00:25 crc kubenswrapper[4728]: I1216 15:00:25.400957 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 15:00:25 crc kubenswrapper[4728]: I1216 15:00:25.401627 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 15:00:25 crc kubenswrapper[4728]: I1216 15:00:25.654070 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 15:00:25 crc kubenswrapper[4728]: I1216 15:00:25.665451 4728 generic.go:334] "Generic (PLEG): container finished" podID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerID="f0c68431491303007bca2545a9558391c24c9cee44e80cd1bb80be3962852403" exitCode=0 Dec 16 15:00:25 crc kubenswrapper[4728]: I1216 15:00:25.666285 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxlrz" 
event={"ID":"c89065be-d4d7-4201-b4fd-f1bc18df6a60","Type":"ContainerDied","Data":"f0c68431491303007bca2545a9558391c24c9cee44e80cd1bb80be3962852403"} Dec 16 15:00:25 crc kubenswrapper[4728]: I1216 15:00:25.710718 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 15:00:26 crc kubenswrapper[4728]: I1216 15:00:26.632946 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dd799fbcb-vr2rr"] Dec 16 15:00:26 crc kubenswrapper[4728]: I1216 15:00:26.633397 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" podUID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" containerName="controller-manager" containerID="cri-o://3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881" gracePeriod=30 Dec 16 15:00:26 crc kubenswrapper[4728]: I1216 15:00:26.647507 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62"] Dec 16 15:00:26 crc kubenswrapper[4728]: I1216 15:00:26.647733 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" podUID="68e0932b-b003-420f-b476-f161a5b5193f" containerName="route-controller-manager" containerID="cri-o://edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12" gracePeriod=30 Dec 16 15:00:27 crc kubenswrapper[4728]: I1216 15:00:27.054544 4728 patch_prober.go:28] interesting pod/route-controller-manager-6446b6f9d5-tbl62 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Dec 16 15:00:27 crc kubenswrapper[4728]: I1216 15:00:27.063227 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" podUID="68e0932b-b003-420f-b476-f161a5b5193f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Dec 16 15:00:27 crc kubenswrapper[4728]: I1216 15:00:27.066027 4728 patch_prober.go:28] interesting pod/controller-manager-dd799fbcb-vr2rr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 16 15:00:27 crc kubenswrapper[4728]: I1216 15:00:27.066077 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" podUID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 16 15:00:27 crc kubenswrapper[4728]: I1216 15:00:27.745174 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdgm"] Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.379429 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.384362 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.416538 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6774cbd74c-pjllj"] Dec 16 15:00:28 crc kubenswrapper[4728]: E1216 15:00:28.416819 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" containerName="controller-manager" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.416832 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" containerName="controller-manager" Dec 16 15:00:28 crc kubenswrapper[4728]: E1216 15:00:28.416847 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e0932b-b003-420f-b476-f161a5b5193f" containerName="route-controller-manager" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.416853 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e0932b-b003-420f-b476-f161a5b5193f" containerName="route-controller-manager" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.416953 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" containerName="controller-manager" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.416969 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e0932b-b003-420f-b476-f161a5b5193f" containerName="route-controller-manager" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.417374 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.429021 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6774cbd74c-pjllj"] Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534071 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2clvk\" (UniqueName: \"kubernetes.io/projected/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-kube-api-access-2clvk\") pod \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534134 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-client-ca\") pod \"68e0932b-b003-420f-b476-f161a5b5193f\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534158 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-config\") pod \"68e0932b-b003-420f-b476-f161a5b5193f\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534201 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-serving-cert\") pod \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534227 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68e0932b-b003-420f-b476-f161a5b5193f-serving-cert\") pod \"68e0932b-b003-420f-b476-f161a5b5193f\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534259 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-client-ca\") pod \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534298 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-proxy-ca-bundles\") pod \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534397 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-config\") pod \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\" (UID: \"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.534515 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmj99\" (UniqueName: \"kubernetes.io/projected/68e0932b-b003-420f-b476-f161a5b5193f-kube-api-access-cmj99\") pod \"68e0932b-b003-420f-b476-f161a5b5193f\" (UID: \"68e0932b-b003-420f-b476-f161a5b5193f\") " Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535063 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-config\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535137 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-proxy-ca-bundles\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535200 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-config" (OuterVolumeSpecName: "config") pod "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" (UID: "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535214 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlfkl\" (UniqueName: \"kubernetes.io/projected/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-kube-api-access-vlfkl\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535295 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-serving-cert\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535396 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-client-ca\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535603 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535755 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" (UID: "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.535962 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-client-ca" (OuterVolumeSpecName: "client-ca") pod "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" (UID: "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.536738 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-config" (OuterVolumeSpecName: "config") pod "68e0932b-b003-420f-b476-f161a5b5193f" (UID: "68e0932b-b003-420f-b476-f161a5b5193f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.537182 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-client-ca" (OuterVolumeSpecName: "client-ca") pod "68e0932b-b003-420f-b476-f161a5b5193f" (UID: "68e0932b-b003-420f-b476-f161a5b5193f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.539741 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e0932b-b003-420f-b476-f161a5b5193f-kube-api-access-cmj99" (OuterVolumeSpecName: "kube-api-access-cmj99") pod "68e0932b-b003-420f-b476-f161a5b5193f" (UID: "68e0932b-b003-420f-b476-f161a5b5193f"). InnerVolumeSpecName "kube-api-access-cmj99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.539784 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-kube-api-access-2clvk" (OuterVolumeSpecName: "kube-api-access-2clvk") pod "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" (UID: "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf"). InnerVolumeSpecName "kube-api-access-2clvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.539875 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68e0932b-b003-420f-b476-f161a5b5193f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68e0932b-b003-420f-b476-f161a5b5193f" (UID: "68e0932b-b003-420f-b476-f161a5b5193f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.540471 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" (UID: "7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637024 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-config\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637084 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-proxy-ca-bundles\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637116 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlfkl\" (UniqueName: \"kubernetes.io/projected/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-kube-api-access-vlfkl\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637150 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-serving-cert\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637197 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-client-ca\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637257 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68e0932b-b003-420f-b476-f161a5b5193f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637272 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637286 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637302 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmj99\" (UniqueName: \"kubernetes.io/projected/68e0932b-b003-420f-b476-f161a5b5193f-kube-api-access-cmj99\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637313 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2clvk\" (UniqueName: \"kubernetes.io/projected/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-kube-api-access-2clvk\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 
15:00:28.637325 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637338 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e0932b-b003-420f-b476-f161a5b5193f-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.637350 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.638501 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-client-ca\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.639613 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-proxy-ca-bundles\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.639677 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-config\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.646839 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-serving-cert\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.662178 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlfkl\" (UniqueName: \"kubernetes.io/projected/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-kube-api-access-vlfkl\") pod \"controller-manager-6774cbd74c-pjllj\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.682161 4728 generic.go:334] "Generic (PLEG): container finished" podID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" containerID="3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881" exitCode=0 Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.682223 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" event={"ID":"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf","Type":"ContainerDied","Data":"3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881"} Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.682250 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" event={"ID":"7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf","Type":"ContainerDied","Data":"c6f5d0966b1019af6eac0bb1edbfb58774a12e6c6a1fb3c3b55331fdb88386e8"} Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.682272 4728 scope.go:117] "RemoveContainer" containerID="3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.682279 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dd799fbcb-vr2rr" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.686546 4728 generic.go:334] "Generic (PLEG): container finished" podID="68e0932b-b003-420f-b476-f161a5b5193f" containerID="edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12" exitCode=0 Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.686579 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.686613 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" event={"ID":"68e0932b-b003-420f-b476-f161a5b5193f","Type":"ContainerDied","Data":"edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12"} Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.686685 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62" event={"ID":"68e0932b-b003-420f-b476-f161a5b5193f","Type":"ContainerDied","Data":"c23b0c3cb504707a4873ba8c038d9cdcc9dc9b542fe865285005b2da854862b6"} Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.686777 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5fdgm" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="registry-server" containerID="cri-o://e18df7954027147ab0bb73754629f6cd84b0858d96f2f7c69c11aee60607c3b9" gracePeriod=2 Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.721867 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62"] Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.725694 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446b6f9d5-tbl62"] Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.739116 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dd799fbcb-vr2rr"] Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.741654 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dd799fbcb-vr2rr"] Dec 16 15:00:28 crc kubenswrapper[4728]: I1216 15:00:28.764254 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:29 crc kubenswrapper[4728]: I1216 15:00:29.514773 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e0932b-b003-420f-b476-f161a5b5193f" path="/var/lib/kubelet/pods/68e0932b-b003-420f-b476-f161a5b5193f/volumes" Dec 16 15:00:29 crc kubenswrapper[4728]: I1216 15:00:29.515955 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf" path="/var/lib/kubelet/pods/7f6fcd00-7f6e-4f38-a3d8-fde5ca253abf/volumes" Dec 16 15:00:30 crc kubenswrapper[4728]: I1216 15:00:30.335066 4728 scope.go:117] "RemoveContainer" containerID="3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881" Dec 16 15:00:30 crc kubenswrapper[4728]: E1216 15:00:30.335795 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881\": container with ID starting with 3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881 not found: ID does not exist" containerID="3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881" Dec 16 15:00:30 crc kubenswrapper[4728]: I1216 15:00:30.335881 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881"} err="failed to get container status \"3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881\": rpc error: code = NotFound desc = could not find container \"3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881\": container with ID starting with 3822e4698e6c6569375a8f8d0d62216eefb2e287ff7840460002e39c89237881 not found: ID does not exist" Dec 16 15:00:30 crc kubenswrapper[4728]: I1216 15:00:30.335935 4728 scope.go:117] "RemoveContainer" containerID="edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12" Dec 16 15:00:30 crc kubenswrapper[4728]: I1216 15:00:30.702186 4728 generic.go:334] "Generic (PLEG): container finished" podID="b722868c-334c-4e92-a919-051140c48283" containerID="e18df7954027147ab0bb73754629f6cd84b0858d96f2f7c69c11aee60607c3b9" exitCode=0 Dec 16 15:00:30 crc kubenswrapper[4728]: I1216 15:00:30.702226 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerDied","Data":"e18df7954027147ab0bb73754629f6cd84b0858d96f2f7c69c11aee60607c3b9"} Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.243722 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5"] Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.245211 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.249545 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.249701 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.249889 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.250095 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.250346 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.250569 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.256222 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5"] Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.381155 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qwq\" (UniqueName: \"kubernetes.io/projected/ba637ae5-45df-40ef-bf20-9a66e079197e-kube-api-access-h9qwq\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.381219 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-config\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.381252 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba637ae5-45df-40ef-bf20-9a66e079197e-serving-cert\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.381273 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-client-ca\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.482517 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba637ae5-45df-40ef-bf20-9a66e079197e-serving-cert\") pod 
\"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.482594 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-client-ca\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.482684 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9qwq\" (UniqueName: \"kubernetes.io/projected/ba637ae5-45df-40ef-bf20-9a66e079197e-kube-api-access-h9qwq\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.482760 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-config\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.484651 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-config\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.486921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-client-ca\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.497899 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba637ae5-45df-40ef-bf20-9a66e079197e-serving-cert\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.513264 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qwq\" (UniqueName: \"kubernetes.io/projected/ba637ae5-45df-40ef-bf20-9a66e079197e-kube-api-access-h9qwq\") pod \"route-controller-manager-8666567b68-qvgp5\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:31 crc kubenswrapper[4728]: I1216 15:00:31.577216 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:32 crc kubenswrapper[4728]: I1216 15:00:32.970274 4728 scope.go:117] "RemoveContainer" containerID="edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12" Dec 16 15:00:32 crc kubenswrapper[4728]: E1216 15:00:32.971093 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12\": container with ID starting with edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12 not found: ID does not exist" containerID="edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12" Dec 16 15:00:32 crc kubenswrapper[4728]: I1216 15:00:32.971128 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12"} err="failed to get container status \"edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12\": rpc error: code = NotFound desc = could not find container \"edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12\": container with ID starting with edf24e3e6877917795445794a5bb41e1be7d95bd617ff0096d4185cdc0b5ae12 not found: ID does not exist" Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.742402 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdgm" event={"ID":"b722868c-334c-4e92-a919-051140c48283","Type":"ContainerDied","Data":"30374b7be49ab8ceed38149861a1e2f0712ec4c92d10f9d0c97e5b2a9997a094"} Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.743065 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30374b7be49ab8ceed38149861a1e2f0712ec4c92d10f9d0c97e5b2a9997a094" Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.827309 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.925577 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g97c5\" (UniqueName: \"kubernetes.io/projected/b722868c-334c-4e92-a919-051140c48283-kube-api-access-g97c5\") pod \"b722868c-334c-4e92-a919-051140c48283\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.925676 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-catalog-content\") pod \"b722868c-334c-4e92-a919-051140c48283\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.925711 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-utilities\") pod \"b722868c-334c-4e92-a919-051140c48283\" (UID: \"b722868c-334c-4e92-a919-051140c48283\") " Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.926671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-utilities" (OuterVolumeSpecName: "utilities") pod "b722868c-334c-4e92-a919-051140c48283" (UID: "b722868c-334c-4e92-a919-051140c48283"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:33 crc kubenswrapper[4728]: I1216 15:00:33.936802 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b722868c-334c-4e92-a919-051140c48283-kube-api-access-g97c5" (OuterVolumeSpecName: "kube-api-access-g97c5") pod "b722868c-334c-4e92-a919-051140c48283" (UID: "b722868c-334c-4e92-a919-051140c48283"). InnerVolumeSpecName "kube-api-access-g97c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.026923 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g97c5\" (UniqueName: \"kubernetes.io/projected/b722868c-334c-4e92-a919-051140c48283-kube-api-access-g97c5\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.026971 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.037852 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b722868c-334c-4e92-a919-051140c48283" (UID: "b722868c-334c-4e92-a919-051140c48283"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.128216 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b722868c-334c-4e92-a919-051140c48283-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.141703 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6774cbd74c-pjllj"] Dec 16 15:00:34 crc kubenswrapper[4728]: W1216 15:00:34.153860 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd097d82c_1ab8_4c6e_80e2_71dcb09ac05f.slice/crio-ee1b95c6fe368d2324eb27e266c9827b7024942f49663f6cf5744f4b4434e6a8 WatchSource:0}: Error finding container ee1b95c6fe368d2324eb27e266c9827b7024942f49663f6cf5744f4b4434e6a8: Status 404 returned error can't find the container with id ee1b95c6fe368d2324eb27e266c9827b7024942f49663f6cf5744f4b4434e6a8 Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.259331 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5"] Dec 16 15:00:34 crc kubenswrapper[4728]: W1216 15:00:34.278649 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba637ae5_45df_40ef_bf20_9a66e079197e.slice/crio-a634f24ba8aba70babd37bdc0f1518ee126171ff84cc1f80bdb5dfbf20589a0c WatchSource:0}: Error finding container a634f24ba8aba70babd37bdc0f1518ee126171ff84cc1f80bdb5dfbf20589a0c: Status 404 returned error can't find the container with id a634f24ba8aba70babd37bdc0f1518ee126171ff84cc1f80bdb5dfbf20589a0c Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.769281 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" 
event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerStarted","Data":"e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.771108 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" event={"ID":"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f","Type":"ContainerStarted","Data":"8705fc1331cf6481131f39dc260cecb20e10f186736dca332b095126cab23cbe"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.771136 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" event={"ID":"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f","Type":"ContainerStarted","Data":"ee1b95c6fe368d2324eb27e266c9827b7024942f49663f6cf5744f4b4434e6a8"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.772104 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" event={"ID":"ba637ae5-45df-40ef-bf20-9a66e079197e","Type":"ContainerStarted","Data":"a634f24ba8aba70babd37bdc0f1518ee126171ff84cc1f80bdb5dfbf20589a0c"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.773704 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerStarted","Data":"a1a04c3d312fa4b3ee2c5c2de1af60c5c59c6f2d639e6836598ff5d1a7f2e2cb"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.775420 4728 generic.go:334] "Generic (PLEG): container finished" podID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerID="f0f07265b331314df4d207f296692fdc71d3779a0a99bd06990473e4d93f2f64" exitCode=0 Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.775447 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs9cj" event={"ID":"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa","Type":"ContainerDied","Data":"f0f07265b331314df4d207f296692fdc71d3779a0a99bd06990473e4d93f2f64"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.777387 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerStarted","Data":"89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.779158 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxlrz" event={"ID":"c89065be-d4d7-4201-b4fd-f1bc18df6a60","Type":"ContainerStarted","Data":"9eb8c21a19472ffec46d107bf6767cf77b9eac889d419d27c130ea70bc11c2a8"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.780628 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerStarted","Data":"feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7"} Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.780675 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdgm" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.805901 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bxlrz" podStartSLOduration=2.969499879 podStartE2EDuration="1m1.805877911s" podCreationTimestamp="2025-12-16 14:59:33 +0000 UTC" firstStartedPulling="2025-12-16 14:59:34.922621044 +0000 UTC m=+155.762800038" lastFinishedPulling="2025-12-16 15:00:33.758999046 +0000 UTC m=+214.599178070" observedRunningTime="2025-12-16 15:00:34.805648285 +0000 UTC m=+215.645827299" watchObservedRunningTime="2025-12-16 15:00:34.805877911 +0000 UTC m=+215.646056895" Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.822329 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5fdgm"] Dec 16 15:00:34 crc kubenswrapper[4728]: I1216 15:00:34.827738 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5fdgm"] Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.511858 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b722868c-334c-4e92-a919-051140c48283" path="/var/lib/kubelet/pods/b722868c-334c-4e92-a919-051140c48283/volumes" Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.787317 4728 generic.go:334] "Generic (PLEG): container finished" podID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerID="a1a04c3d312fa4b3ee2c5c2de1af60c5c59c6f2d639e6836598ff5d1a7f2e2cb" exitCode=0 Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.787393 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerDied","Data":"a1a04c3d312fa4b3ee2c5c2de1af60c5c59c6f2d639e6836598ff5d1a7f2e2cb"} Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.789723 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5541d34-e213-4545-af81-6410a52db88d" containerID="deac880a9a055f0fe191962c30f98cd232afd7bf438f6bcf66f21eaa1e144559" exitCode=0 Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.789757 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78qzz" event={"ID":"d5541d34-e213-4545-af81-6410a52db88d","Type":"ContainerDied","Data":"deac880a9a055f0fe191962c30f98cd232afd7bf438f6bcf66f21eaa1e144559"} Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.792215 4728 generic.go:334] "Generic (PLEG): container finished" podID="4dccc964-0fe8-499e-a852-d971100829c1" containerID="89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f" exitCode=0 Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.792268 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerDied","Data":"89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f"} Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.796951 4728 generic.go:334] "Generic (PLEG): container finished" podID="e74a33ea-23b7-47fc-a463-566f8b579917" containerID="feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7" exitCode=0 Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.796995 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" 
event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerDied","Data":"feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7"} Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.799915 4728 generic.go:334] "Generic (PLEG): container finished" podID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerID="e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331" exitCode=0 Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.799997 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerDied","Data":"e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331"} Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.803586 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" event={"ID":"ba637ae5-45df-40ef-bf20-9a66e079197e","Type":"ContainerStarted","Data":"77c9d630f04930ee3f47dc513ccdd11937268a0a59fa7e87b965426c36a3c6ce"} Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.804210 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.808569 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.895461 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" podStartSLOduration=9.895443910000001 podStartE2EDuration="9.89544391s" podCreationTimestamp="2025-12-16 15:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:35.890755092 +0000 UTC m=+216.730934076" watchObservedRunningTime="2025-12-16 15:00:35.89544391 +0000 UTC m=+216.735622894" Dec 16 15:00:35 crc kubenswrapper[4728]: I1216 15:00:35.911053 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" podStartSLOduration=9.911035162 podStartE2EDuration="9.911035162s" podCreationTimestamp="2025-12-16 15:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:35.910020267 +0000 UTC m=+216.750199251" watchObservedRunningTime="2025-12-16 15:00:35.911035162 +0000 UTC m=+216.751214146" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.811645 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs9cj" event={"ID":"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa","Type":"ContainerStarted","Data":"6e57f507eb1e0ef100ab944fecfb501a431808f11435188daf9c056f7d11f23d"} Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.816597 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78qzz" event={"ID":"d5541d34-e213-4545-af81-6410a52db88d","Type":"ContainerStarted","Data":"a691516666a2d18185742e5597d9ba7b7b07b0b3fe5c2225eec93cbffde2feb2"} Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.827519 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" 
event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerStarted","Data":"1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6"} Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.833938 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerStarted","Data":"8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6"} Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.836732 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerStarted","Data":"080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c"} Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.840100 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerStarted","Data":"8f2abbf346de87e3d9e27aa179154aabf71a697abedd8b573eac53cd71bc9d1d"} Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.840693 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.844334 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fs9cj" podStartSLOduration=3.170212087 podStartE2EDuration="1m3.844315341s" podCreationTimestamp="2025-12-16 14:59:33 +0000 UTC" firstStartedPulling="2025-12-16 14:59:36.002350145 +0000 UTC m=+156.842529129" lastFinishedPulling="2025-12-16 15:00:36.676453399 +0000 UTC m=+217.516632383" observedRunningTime="2025-12-16 15:00:36.838051683 +0000 UTC m=+217.678230687" watchObservedRunningTime="2025-12-16 15:00:36.844315341 +0000 UTC m=+217.684494325" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.845592 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.863826 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9zk7d" podStartSLOduration=3.402412713 podStartE2EDuration="1m5.863807471s" podCreationTimestamp="2025-12-16 14:59:31 +0000 UTC" firstStartedPulling="2025-12-16 14:59:33.88651094 +0000 UTC m=+154.726689924" lastFinishedPulling="2025-12-16 15:00:36.347905688 +0000 UTC m=+217.188084682" observedRunningTime="2025-12-16 15:00:36.86181032 +0000 UTC m=+217.701989304" watchObservedRunningTime="2025-12-16 15:00:36.863807471 +0000 UTC m=+217.703986455" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.879069 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlwvp" podStartSLOduration=2.383945192 podStartE2EDuration="1m5.879030773s" podCreationTimestamp="2025-12-16 14:59:31 +0000 UTC" firstStartedPulling="2025-12-16 14:59:32.873404464 +0000 UTC m=+153.713583448" lastFinishedPulling="2025-12-16 15:00:36.368490045 +0000 UTC m=+217.208669029" observedRunningTime="2025-12-16 15:00:36.876600833 +0000 UTC m=+217.716779817" watchObservedRunningTime="2025-12-16 15:00:36.879030773 +0000 UTC m=+217.719209757" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.902072 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-78qzz" podStartSLOduration=2.486555713 podStartE2EDuration="1m5.902057013s" podCreationTimestamp="2025-12-16 14:59:31 +0000 UTC" firstStartedPulling="2025-12-16 14:59:32.871733462 +0000 UTC m=+153.711912446" lastFinishedPulling="2025-12-16 15:00:36.287234742 +0000 UTC m=+217.127413746" observedRunningTime="2025-12-16 15:00:36.898772289 +0000 UTC m=+217.738951283" watchObservedRunningTime="2025-12-16 15:00:36.902057013 +0000 UTC m=+217.742235997" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.941134 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6vw5z" podStartSLOduration=2.305959131 podStartE2EDuration="1m2.941114095s" podCreationTimestamp="2025-12-16 14:59:34 +0000 UTC" firstStartedPulling="2025-12-16 14:59:36.01406333 +0000 UTC m=+156.854242314" lastFinishedPulling="2025-12-16 15:00:36.649218304 +0000 UTC m=+217.489397278" observedRunningTime="2025-12-16 15:00:36.920913726 +0000 UTC m=+217.761092710" watchObservedRunningTime="2025-12-16 15:00:36.941114095 +0000 UTC m=+217.781293079" Dec 16 15:00:36 crc kubenswrapper[4728]: I1216 15:00:36.962593 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rgppn" podStartSLOduration=2.575742774 podStartE2EDuration="1m5.962574164s" podCreationTimestamp="2025-12-16 14:59:31 +0000 UTC" firstStartedPulling="2025-12-16 14:59:32.865260189 +0000 UTC m=+153.705439173" lastFinishedPulling="2025-12-16 15:00:36.252091569 +0000 UTC m=+217.092270563" observedRunningTime="2025-12-16 15:00:36.957271171 +0000 UTC m=+217.797450165" watchObservedRunningTime="2025-12-16 15:00:36.962574164 +0000 UTC m=+217.802753158" Dec 16 15:00:38 crc kubenswrapper[4728]: I1216 15:00:38.818856 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:00:38 crc kubenswrapper[4728]: I1216 15:00:38.819232 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:00:38 crc kubenswrapper[4728]: I1216 15:00:38.819310 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:00:38 crc kubenswrapper[4728]: I1216 15:00:38.820059 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:00:38 crc kubenswrapper[4728]: I1216 15:00:38.820161 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" 
containerID="cri-o://1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea" gracePeriod=600 Dec 16 15:00:39 crc kubenswrapper[4728]: I1216 15:00:39.860355 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea" exitCode=0 Dec 16 15:00:39 crc kubenswrapper[4728]: I1216 15:00:39.860522 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea"} Dec 16 15:00:40 crc kubenswrapper[4728]: I1216 15:00:40.868479 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"4bda00ce73e1c1ab471f206d48aed0e38d16bcd1f6b879870ad51db12f879d97"} Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.707153 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rgppn" Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.707611 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rgppn" Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.767852 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rgppn" Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.910529 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.910668 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.936230 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rgppn" Dec 16 15:00:41 crc kubenswrapper[4728]: I1216 15:00:41.960832 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.145615 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.145670 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.213608 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.328671 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.328737 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.392046 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 
15:00:42.920897 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.929486 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 15:00:42 crc kubenswrapper[4728]: I1216 15:00:42.932055 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 15:00:43 crc kubenswrapper[4728]: I1216 15:00:43.722450 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 15:00:43 crc kubenswrapper[4728]: I1216 15:00:43.722848 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 15:00:43 crc kubenswrapper[4728]: I1216 15:00:43.773902 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 15:00:43 crc kubenswrapper[4728]: I1216 15:00:43.945632 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.158667 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.158913 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.220906 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.346672 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9zk7d"] Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.547538 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlwvp"] Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.891201 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlwvp" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="registry-server" containerID="cri-o://1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6" gracePeriod=2 Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.891744 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9zk7d" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="registry-server" containerID="cri-o://080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c" gracePeriod=2 Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.949118 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.951126 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 15:00:44.951586 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 15:00:44 crc kubenswrapper[4728]: I1216 
15:00:44.986264 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.412223 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.416945 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522120 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-utilities\") pod \"4dccc964-0fe8-499e-a852-d971100829c1\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522170 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-utilities\") pod \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522219 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-catalog-content\") pod \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522249 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgf4d\" (UniqueName: \"kubernetes.io/projected/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-kube-api-access-qgf4d\") pod \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\" (UID: \"8b64cba1-1ce1-4715-bd9f-831d3db30fc2\") " Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522313 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvg5p\" (UniqueName: \"kubernetes.io/projected/4dccc964-0fe8-499e-a852-d971100829c1-kube-api-access-mvg5p\") pod \"4dccc964-0fe8-499e-a852-d971100829c1\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522333 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-catalog-content\") pod \"4dccc964-0fe8-499e-a852-d971100829c1\" (UID: \"4dccc964-0fe8-499e-a852-d971100829c1\") " Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.522906 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-utilities" (OuterVolumeSpecName: "utilities") pod "4dccc964-0fe8-499e-a852-d971100829c1" (UID: "4dccc964-0fe8-499e-a852-d971100829c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.524987 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-utilities" (OuterVolumeSpecName: "utilities") pod "8b64cba1-1ce1-4715-bd9f-831d3db30fc2" (UID: "8b64cba1-1ce1-4715-bd9f-831d3db30fc2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.528990 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dccc964-0fe8-499e-a852-d971100829c1-kube-api-access-mvg5p" (OuterVolumeSpecName: "kube-api-access-mvg5p") pod "4dccc964-0fe8-499e-a852-d971100829c1" (UID: "4dccc964-0fe8-499e-a852-d971100829c1"). InnerVolumeSpecName "kube-api-access-mvg5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.529202 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-kube-api-access-qgf4d" (OuterVolumeSpecName: "kube-api-access-qgf4d") pod "8b64cba1-1ce1-4715-bd9f-831d3db30fc2" (UID: "8b64cba1-1ce1-4715-bd9f-831d3db30fc2"). InnerVolumeSpecName "kube-api-access-qgf4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.593941 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b64cba1-1ce1-4715-bd9f-831d3db30fc2" (UID: "8b64cba1-1ce1-4715-bd9f-831d3db30fc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.601267 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dccc964-0fe8-499e-a852-d971100829c1" (UID: "4dccc964-0fe8-499e-a852-d971100829c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.624309 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvg5p\" (UniqueName: \"kubernetes.io/projected/4dccc964-0fe8-499e-a852-d971100829c1-kube-api-access-mvg5p\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.624340 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.624350 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dccc964-0fe8-499e-a852-d971100829c1-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.624359 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.624369 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.624377 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgf4d\" (UniqueName: \"kubernetes.io/projected/8b64cba1-1ce1-4715-bd9f-831d3db30fc2-kube-api-access-qgf4d\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.899357 4728 generic.go:334] "Generic (PLEG): container finished" podID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerID="080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c" exitCode=0 Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.899436 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerDied","Data":"080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c"} Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.899465 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk7d" event={"ID":"8b64cba1-1ce1-4715-bd9f-831d3db30fc2","Type":"ContainerDied","Data":"64fbd43d3eba207cbc8bddd1f145917f19a20a8663de7d551f80e7eef854a77d"} Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.899483 4728 scope.go:117] "RemoveContainer" containerID="080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.899504 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zk7d" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.902693 4728 generic.go:334] "Generic (PLEG): container finished" podID="4dccc964-0fe8-499e-a852-d971100829c1" containerID="1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6" exitCode=0 Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.902767 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlwvp" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.902774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerDied","Data":"1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6"} Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.902801 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlwvp" event={"ID":"4dccc964-0fe8-499e-a852-d971100829c1","Type":"ContainerDied","Data":"a3cec02942e1678ae760a51c6373a09173f6ef2595bc4e31ec2e2eb1b8637e31"} Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.921000 4728 scope.go:117] "RemoveContainer" containerID="e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.939315 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9zk7d"] Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.941667 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9zk7d"] Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.948660 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.952430 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlwvp"] Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.955782 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlwvp"] Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.974884 4728 scope.go:117] "RemoveContainer" containerID="5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0" Dec 16 15:00:45 crc kubenswrapper[4728]: I1216 15:00:45.998475 4728 scope.go:117] "RemoveContainer" containerID="080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c" Dec 16 15:00:46 crc kubenswrapper[4728]: E1216 15:00:46.003352 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c\": container with ID starting with 080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c not found: ID does not exist" containerID="080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.003392 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c"} err="failed to get container status \"080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c\": rpc error: code = NotFound desc = could not find container \"080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c\": container with ID starting with 080b04a988520b3f3f2ed1055b26331a66043f42b4201c49e05b33ef42569a7c not found: ID does not exist" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.003448 4728 scope.go:117] "RemoveContainer" containerID="e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331" Dec 16 15:00:46 crc kubenswrapper[4728]: E1216 15:00:46.003859 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331\": container with ID starting with e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331 not found: ID does not exist" containerID="e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.003884 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331"} err="failed to get container status \"e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331\": rpc error: code = NotFound desc = could not find container \"e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331\": container with ID starting with e49db50457ccbd8488393d00c3f046fcf32e98b777e96acd719d95bf3a2ee331 not found: ID does not exist" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.003920 4728 scope.go:117] "RemoveContainer" containerID="5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0" Dec 16 15:00:46 crc kubenswrapper[4728]: E1216 15:00:46.004295 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0\": container with ID starting with 5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0 not found: ID does not exist" containerID="5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.004323 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0"} err="failed to get container status \"5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0\": rpc error: code = NotFound desc = could not find container \"5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0\": container with ID starting with 5cfef1fd9d2a8befb2715a36800a27972d91a865ef4aab08f8e27030a05488f0 not found: ID does not exist" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.004340 4728 scope.go:117] "RemoveContainer" containerID="1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.029975 4728 scope.go:117] "RemoveContainer" containerID="89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.052951 4728 scope.go:117] "RemoveContainer" containerID="fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.065294 4728 scope.go:117] "RemoveContainer" containerID="1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6" Dec 16 15:00:46 crc kubenswrapper[4728]: E1216 15:00:46.065760 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6\": container with ID starting with 1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6 not found: ID does not exist" containerID="1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.065825 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6"} err="failed to get 
container status \"1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6\": rpc error: code = NotFound desc = could not find container \"1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6\": container with ID starting with 1b347df1a548fb45eaf5c0d80ab4c7a2334eb64cb340a14a0b0b11400546adf6 not found: ID does not exist" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.065859 4728 scope.go:117] "RemoveContainer" containerID="89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f" Dec 16 15:00:46 crc kubenswrapper[4728]: E1216 15:00:46.066158 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f\": container with ID starting with 89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f not found: ID does not exist" containerID="89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.066187 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f"} err="failed to get container status \"89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f\": rpc error: code = NotFound desc = could not find container \"89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f\": container with ID starting with 89358097c9632bf8eb20849b568a3fe9ca2e7bbdc5f9415c69144c0e531e135f not found: ID does not exist" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.066207 4728 scope.go:117] "RemoveContainer" containerID="fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c" Dec 16 15:00:46 crc kubenswrapper[4728]: E1216 15:00:46.066476 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c\": container with ID starting with fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c not found: ID does not exist" containerID="fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.066524 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c"} err="failed to get container status \"fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c\": rpc error: code = NotFound desc = could not find container \"fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c\": container with ID starting with fe244bc3aac51d1344f710b4942bf00dee6b233f125ddac8c4efb94f39f8d39c not found: ID does not exist" Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.644144 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6774cbd74c-pjllj"] Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.644663 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" podUID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" containerName="controller-manager" containerID="cri-o://8705fc1331cf6481131f39dc260cecb20e10f186736dca332b095126cab23cbe" gracePeriod=30 Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.743589 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fs9cj"] Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.746024 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5"] Dec 16 15:00:46 crc kubenswrapper[4728]: I1216 15:00:46.746220 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" podUID="ba637ae5-45df-40ef-bf20-9a66e079197e" containerName="route-controller-manager" containerID="cri-o://77c9d630f04930ee3f47dc513ccdd11937268a0a59fa7e87b965426c36a3c6ce" gracePeriod=30 Dec 16 15:00:47 crc kubenswrapper[4728]: I1216 15:00:47.518755 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dccc964-0fe8-499e-a852-d971100829c1" path="/var/lib/kubelet/pods/4dccc964-0fe8-499e-a852-d971100829c1/volumes" Dec 16 15:00:47 crc kubenswrapper[4728]: I1216 15:00:47.519684 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" path="/var/lib/kubelet/pods/8b64cba1-1ce1-4715-bd9f-831d3db30fc2/volumes" Dec 16 15:00:47 crc kubenswrapper[4728]: I1216 15:00:47.916309 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fs9cj" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="registry-server" containerID="cri-o://6e57f507eb1e0ef100ab944fecfb501a431808f11435188daf9c056f7d11f23d" gracePeriod=2 Dec 16 15:00:47 crc kubenswrapper[4728]: I1216 15:00:47.977922 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6f6v"] Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.765848 4728 patch_prober.go:28] interesting pod/controller-manager-6774cbd74c-pjllj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.766120 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" podUID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.930910 4728 generic.go:334] "Generic (PLEG): container finished" podID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" containerID="8705fc1331cf6481131f39dc260cecb20e10f186736dca332b095126cab23cbe" exitCode=0 Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.930974 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" event={"ID":"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f","Type":"ContainerDied","Data":"8705fc1331cf6481131f39dc260cecb20e10f186736dca332b095126cab23cbe"} Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.932064 4728 generic.go:334] "Generic (PLEG): container finished" podID="ba637ae5-45df-40ef-bf20-9a66e079197e" containerID="77c9d630f04930ee3f47dc513ccdd11937268a0a59fa7e87b965426c36a3c6ce" exitCode=0 Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.932105 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" 
event={"ID":"ba637ae5-45df-40ef-bf20-9a66e079197e","Type":"ContainerDied","Data":"77c9d630f04930ee3f47dc513ccdd11937268a0a59fa7e87b965426c36a3c6ce"} Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.933931 4728 generic.go:334] "Generic (PLEG): container finished" podID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerID="6e57f507eb1e0ef100ab944fecfb501a431808f11435188daf9c056f7d11f23d" exitCode=0 Dec 16 15:00:48 crc kubenswrapper[4728]: I1216 15:00:48.933975 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs9cj" event={"ID":"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa","Type":"ContainerDied","Data":"6e57f507eb1e0ef100ab944fecfb501a431808f11435188daf9c056f7d11f23d"} Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.729423 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.734528 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.737169 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.798854 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5fr2\" (UniqueName: \"kubernetes.io/projected/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-kube-api-access-f5fr2\") pod \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.798896 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-utilities\") pod \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.798920 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-proxy-ca-bundles\") pod \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.798940 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-catalog-content\") pod \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\" (UID: \"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.798959 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlfkl\" (UniqueName: \"kubernetes.io/projected/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-kube-api-access-vlfkl\") pod \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.798984 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba637ae5-45df-40ef-bf20-9a66e079197e-serving-cert\") pod \"ba637ae5-45df-40ef-bf20-9a66e079197e\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 
15:00:49.798999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-serving-cert\") pod \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.799013 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-config\") pod \"ba637ae5-45df-40ef-bf20-9a66e079197e\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.799030 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-config\") pod \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.799058 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9qwq\" (UniqueName: \"kubernetes.io/projected/ba637ae5-45df-40ef-bf20-9a66e079197e-kube-api-access-h9qwq\") pod \"ba637ae5-45df-40ef-bf20-9a66e079197e\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.799080 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-client-ca\") pod \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\" (UID: \"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.799099 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-client-ca\") pod \"ba637ae5-45df-40ef-bf20-9a66e079197e\" (UID: \"ba637ae5-45df-40ef-bf20-9a66e079197e\") " Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.802320 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" (UID: "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.802320 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba637ae5-45df-40ef-bf20-9a66e079197e" (UID: "ba637ae5-45df-40ef-bf20-9a66e079197e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.802530 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-utilities" (OuterVolumeSpecName: "utilities") pod "2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" (UID: "2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.806024 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-config" (OuterVolumeSpecName: "config") pod "ba637ae5-45df-40ef-bf20-9a66e079197e" (UID: "ba637ae5-45df-40ef-bf20-9a66e079197e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.806664 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-config" (OuterVolumeSpecName: "config") pod "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" (UID: "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.809745 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-client-ca" (OuterVolumeSpecName: "client-ca") pod "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" (UID: "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.811171 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-kube-api-access-vlfkl" (OuterVolumeSpecName: "kube-api-access-vlfkl") pod "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" (UID: "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f"). InnerVolumeSpecName "kube-api-access-vlfkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.813142 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba637ae5-45df-40ef-bf20-9a66e079197e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba637ae5-45df-40ef-bf20-9a66e079197e" (UID: "ba637ae5-45df-40ef-bf20-9a66e079197e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.822096 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" (UID: "d097d82c-1ab8-4c6e-80e2-71dcb09ac05f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.822228 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-kube-api-access-f5fr2" (OuterVolumeSpecName: "kube-api-access-f5fr2") pod "2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" (UID: "2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa"). InnerVolumeSpecName "kube-api-access-f5fr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.825614 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba637ae5-45df-40ef-bf20-9a66e079197e-kube-api-access-h9qwq" (OuterVolumeSpecName: "kube-api-access-h9qwq") pod "ba637ae5-45df-40ef-bf20-9a66e079197e" (UID: "ba637ae5-45df-40ef-bf20-9a66e079197e"). InnerVolumeSpecName "kube-api-access-h9qwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.837012 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" (UID: "2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901897 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901934 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9qwq\" (UniqueName: \"kubernetes.io/projected/ba637ae5-45df-40ef-bf20-9a66e079197e-kube-api-access-h9qwq\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901947 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901969 4728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901977 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5fr2\" (UniqueName: \"kubernetes.io/projected/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-kube-api-access-f5fr2\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901985 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.901993 4728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.902000 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.902008 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlfkl\" (UniqueName: \"kubernetes.io/projected/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-kube-api-access-vlfkl\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.902016 4728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba637ae5-45df-40ef-bf20-9a66e079197e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.902024 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba637ae5-45df-40ef-bf20-9a66e079197e-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.902032 4728 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.938739 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" event={"ID":"d097d82c-1ab8-4c6e-80e2-71dcb09ac05f","Type":"ContainerDied","Data":"ee1b95c6fe368d2324eb27e266c9827b7024942f49663f6cf5744f4b4434e6a8"} Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.938789 4728 scope.go:117] "RemoveContainer" containerID="8705fc1331cf6481131f39dc260cecb20e10f186736dca332b095126cab23cbe" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.938911 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6774cbd74c-pjllj" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.943503 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" event={"ID":"ba637ae5-45df-40ef-bf20-9a66e079197e","Type":"ContainerDied","Data":"a634f24ba8aba70babd37bdc0f1518ee126171ff84cc1f80bdb5dfbf20589a0c"} Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.943546 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.946374 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs9cj" event={"ID":"2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa","Type":"ContainerDied","Data":"46710557a3ddc54d976c951a44b2f53e2f820d11981c0075af285f1ae874b56b"} Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.946489 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs9cj" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.963568 4728 scope.go:117] "RemoveContainer" containerID="77c9d630f04930ee3f47dc513ccdd11937268a0a59fa7e87b965426c36a3c6ce" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.977108 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6774cbd74c-pjllj"] Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.979258 4728 scope.go:117] "RemoveContainer" containerID="6e57f507eb1e0ef100ab944fecfb501a431808f11435188daf9c056f7d11f23d" Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.981885 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6774cbd74c-pjllj"] Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.996930 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5"] Dec 16 15:00:49 crc kubenswrapper[4728]: I1216 15:00:49.998050 4728 scope.go:117] "RemoveContainer" containerID="f0f07265b331314df4d207f296692fdc71d3779a0a99bd06990473e4d93f2f64" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.000161 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8666567b68-qvgp5"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.002501 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs9cj"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.005037 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs9cj"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.012865 4728 scope.go:117] "RemoveContainer" containerID="f4bc2a5c807793c97651ae3c5d100c260895ba259e71795f2f21c6c9826d9a96" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.241486 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-78qzz"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.242030 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-78qzz" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="registry-server" containerID="cri-o://a691516666a2d18185742e5597d9ba7b7b07b0b3fe5c2225eec93cbffde2feb2" gracePeriod=30 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.250003 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgppn"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.250285 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rgppn" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="registry-server" containerID="cri-o://8f2abbf346de87e3d9e27aa179154aabf71a697abedd8b573eac53cd71bc9d1d" gracePeriod=30 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.254740 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cb68986fd-zrkh2"] Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255025 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255040 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b722868c-334c-4e92-a919-051140c48283" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255049 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255056 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255073 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255079 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255087 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255093 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255105 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255111 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255119 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba637ae5-45df-40ef-bf20-9a66e079197e" containerName="route-controller-manager" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255126 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba637ae5-45df-40ef-bf20-9a66e079197e" containerName="route-controller-manager" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255132 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255138 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255144 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255150 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255157 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255162 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255172 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255179 4728 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255186 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" containerName="controller-manager" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255192 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" containerName="controller-manager" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255199 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255204 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255213 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255218 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="extract-content" Dec 16 15:00:50 crc kubenswrapper[4728]: E1216 15:00:50.255228 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255235 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="extract-utilities" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255350 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b64cba1-1ce1-4715-bd9f-831d3db30fc2" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255361 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dccc964-0fe8-499e-a852-d971100829c1" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255371 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" containerName="controller-manager" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255378 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b722868c-334c-4e92-a919-051140c48283" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255385 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba637ae5-45df-40ef-bf20-9a66e079197e" containerName="route-controller-manager" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255396 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" containerName="registry-server" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.255808 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.259385 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.259535 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.259541 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.259675 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.259799 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.259922 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.264224 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.264952 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.266888 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.267770 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.268949 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k6z5"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.269198 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" podUID="ca945ba8-363c-4e60-b11a-6938e4cb9354" containerName="marketplace-operator" containerID="cri-o://3f7db6888f46974d186c856ac3e6417cc401141d6d4952e0fc6f96412525d753" gracePeriod=30 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.270234 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.275621 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.277525 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.280024 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.280142 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 15:00:50 
crc kubenswrapper[4728]: I1216 15:00:50.280251 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.283475 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb68986fd-zrkh2"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.295162 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxlrz"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.295461 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bxlrz" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="registry-server" containerID="cri-o://9eb8c21a19472ffec46d107bf6767cf77b9eac889d419d27c130ea70bc11c2a8" gracePeriod=30 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.297883 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhkkk"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.298728 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.302682 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vw5z"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.303131 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6vw5z" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="registry-server" containerID="cri-o://8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6" gracePeriod=30 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.305797 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-config\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306067 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-client-ca\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306185 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-proxy-ca-bundles\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306331 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26fa192c-181f-41a5-9e6e-cfa5defa2e56-config\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " 
pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306458 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6br7s\" (UniqueName: \"kubernetes.io/projected/54f118e6-46e7-4cd4-84fc-491746adedb2-kube-api-access-6br7s\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306609 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26fa192c-181f-41a5-9e6e-cfa5defa2e56-serving-cert\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306757 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2csp\" (UniqueName: \"kubernetes.io/projected/26fa192c-181f-41a5-9e6e-cfa5defa2e56-kube-api-access-l2csp\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306862 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25hp\" (UniqueName: \"kubernetes.io/projected/28557b66-a02a-4c9e-880f-3d9f21e5892b-kube-api-access-z25hp\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.306964 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54f118e6-46e7-4cd4-84fc-491746adedb2-serving-cert\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.307105 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28557b66-a02a-4c9e-880f-3d9f21e5892b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.307181 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhkkk"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.307301 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28557b66-a02a-4c9e-880f-3d9f21e5892b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.307460 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26fa192c-181f-41a5-9e6e-cfa5defa2e56-client-ca\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408213 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2csp\" (UniqueName: \"kubernetes.io/projected/26fa192c-181f-41a5-9e6e-cfa5defa2e56-kube-api-access-l2csp\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408254 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25hp\" (UniqueName: \"kubernetes.io/projected/28557b66-a02a-4c9e-880f-3d9f21e5892b-kube-api-access-z25hp\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408277 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54f118e6-46e7-4cd4-84fc-491746adedb2-serving-cert\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408293 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28557b66-a02a-4c9e-880f-3d9f21e5892b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408312 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28557b66-a02a-4c9e-880f-3d9f21e5892b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408329 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26fa192c-181f-41a5-9e6e-cfa5defa2e56-client-ca\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408352 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-config\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408368 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-client-ca\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408391 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-proxy-ca-bundles\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408440 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26fa192c-181f-41a5-9e6e-cfa5defa2e56-config\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408464 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6br7s\" (UniqueName: \"kubernetes.io/projected/54f118e6-46e7-4cd4-84fc-491746adedb2-kube-api-access-6br7s\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.408486 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26fa192c-181f-41a5-9e6e-cfa5defa2e56-serving-cert\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.409555 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26fa192c-181f-41a5-9e6e-cfa5defa2e56-client-ca\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.409752 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-client-ca\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.409931 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-proxy-ca-bundles\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.410139 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54f118e6-46e7-4cd4-84fc-491746adedb2-config\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " 
pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.410625 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26fa192c-181f-41a5-9e6e-cfa5defa2e56-config\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.412117 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54f118e6-46e7-4cd4-84fc-491746adedb2-serving-cert\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.412441 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26fa192c-181f-41a5-9e6e-cfa5defa2e56-serving-cert\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.413049 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28557b66-a02a-4c9e-880f-3d9f21e5892b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.416204 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28557b66-a02a-4c9e-880f-3d9f21e5892b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.422739 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25hp\" (UniqueName: \"kubernetes.io/projected/28557b66-a02a-4c9e-880f-3d9f21e5892b-kube-api-access-z25hp\") pod \"marketplace-operator-79b997595-dhkkk\" (UID: \"28557b66-a02a-4c9e-880f-3d9f21e5892b\") " pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.425690 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6br7s\" (UniqueName: \"kubernetes.io/projected/54f118e6-46e7-4cd4-84fc-491746adedb2-kube-api-access-6br7s\") pod \"controller-manager-cb68986fd-zrkh2\" (UID: \"54f118e6-46e7-4cd4-84fc-491746adedb2\") " pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.425808 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2csp\" (UniqueName: \"kubernetes.io/projected/26fa192c-181f-41a5-9e6e-cfa5defa2e56-kube-api-access-l2csp\") pod \"route-controller-manager-585785d5f4-9l9dw\" (UID: \"26fa192c-181f-41a5-9e6e-cfa5defa2e56\") " pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.572649 4728 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.583988 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.613079 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.842512 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb68986fd-zrkh2"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.971399 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5541d34-e213-4545-af81-6410a52db88d" containerID="a691516666a2d18185742e5597d9ba7b7b07b0b3fe5c2225eec93cbffde2feb2" exitCode=0 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.971436 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78qzz" event={"ID":"d5541d34-e213-4545-af81-6410a52db88d","Type":"ContainerDied","Data":"a691516666a2d18185742e5597d9ba7b7b07b0b3fe5c2225eec93cbffde2feb2"} Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.979435 4728 generic.go:334] "Generic (PLEG): container finished" podID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerID="9eb8c21a19472ffec46d107bf6767cf77b9eac889d419d27c130ea70bc11c2a8" exitCode=0 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.979524 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxlrz" event={"ID":"c89065be-d4d7-4201-b4fd-f1bc18df6a60","Type":"ContainerDied","Data":"9eb8c21a19472ffec46d107bf6767cf77b9eac889d419d27c130ea70bc11c2a8"} Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.982159 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dhkkk"] Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.989978 4728 generic.go:334] "Generic (PLEG): container finished" podID="ca945ba8-363c-4e60-b11a-6938e4cb9354" containerID="3f7db6888f46974d186c856ac3e6417cc401141d6d4952e0fc6f96412525d753" exitCode=0 Dec 16 15:00:50 crc kubenswrapper[4728]: I1216 15:00:50.990086 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" event={"ID":"ca945ba8-363c-4e60-b11a-6938e4cb9354","Type":"ContainerDied","Data":"3f7db6888f46974d186c856ac3e6417cc401141d6d4952e0fc6f96412525d753"} Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.007451 4728 generic.go:334] "Generic (PLEG): container finished" podID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerID="8f2abbf346de87e3d9e27aa179154aabf71a697abedd8b573eac53cd71bc9d1d" exitCode=0 Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.007506 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerDied","Data":"8f2abbf346de87e3d9e27aa179154aabf71a697abedd8b573eac53cd71bc9d1d"} Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.011671 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" 
event={"ID":"54f118e6-46e7-4cd4-84fc-491746adedb2","Type":"ContainerStarted","Data":"cd3598e0d12ed2ce36b20f2a798ca395c14d67e2250b7527b7d2760f711be46f"} Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.012002 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.013061 4728 patch_prober.go:28] interesting pod/controller-manager-cb68986fd-zrkh2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.013094 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" podUID="54f118e6-46e7-4cd4-84fc-491746adedb2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.045569 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" podStartSLOduration=5.045553529 podStartE2EDuration="5.045553529s" podCreationTimestamp="2025-12-16 15:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:51.04398578 +0000 UTC m=+231.884164774" watchObservedRunningTime="2025-12-16 15:00:51.045553529 +0000 UTC m=+231.885732513" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.157434 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw"] Dec 16 15:00:51 crc kubenswrapper[4728]: W1216 15:00:51.186000 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26fa192c_181f_41a5_9e6e_cfa5defa2e56.slice/crio-c0d35edf9aeff8adebfb2518fba7fcb12d14fb757327d7855d8216a20396bcb8 WatchSource:0}: Error finding container c0d35edf9aeff8adebfb2518fba7fcb12d14fb757327d7855d8216a20396bcb8: Status 404 returned error can't find the container with id c0d35edf9aeff8adebfb2518fba7fcb12d14fb757327d7855d8216a20396bcb8 Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.189958 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.224850 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fffqc\" (UniqueName: \"kubernetes.io/projected/d5541d34-e213-4545-af81-6410a52db88d-kube-api-access-fffqc\") pod \"d5541d34-e213-4545-af81-6410a52db88d\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.224902 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-utilities\") pod \"d5541d34-e213-4545-af81-6410a52db88d\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.224971 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-catalog-content\") pod \"d5541d34-e213-4545-af81-6410a52db88d\" (UID: \"d5541d34-e213-4545-af81-6410a52db88d\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.228175 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-utilities" (OuterVolumeSpecName: "utilities") pod "d5541d34-e213-4545-af81-6410a52db88d" (UID: "d5541d34-e213-4545-af81-6410a52db88d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.234732 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5541d34-e213-4545-af81-6410a52db88d-kube-api-access-fffqc" (OuterVolumeSpecName: "kube-api-access-fffqc") pod "d5541d34-e213-4545-af81-6410a52db88d" (UID: "d5541d34-e213-4545-af81-6410a52db88d"). InnerVolumeSpecName "kube-api-access-fffqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.309306 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5541d34-e213-4545-af81-6410a52db88d" (UID: "d5541d34-e213-4545-af81-6410a52db88d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.326249 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fffqc\" (UniqueName: \"kubernetes.io/projected/d5541d34-e213-4545-af81-6410a52db88d-kube-api-access-fffqc\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.326290 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.326304 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5541d34-e213-4545-af81-6410a52db88d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.371602 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.386457 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgppn" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.428241 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-utilities\") pod \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.428299 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-catalog-content\") pod \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.428322 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-utilities\") pod \"c5d6795c-254f-428c-9fc2-c37b2e224b54\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.428355 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqwm5\" (UniqueName: \"kubernetes.io/projected/c89065be-d4d7-4201-b4fd-f1bc18df6a60-kube-api-access-vqwm5\") pod \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\" (UID: \"c89065be-d4d7-4201-b4fd-f1bc18df6a60\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.428391 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtvx2\" (UniqueName: \"kubernetes.io/projected/c5d6795c-254f-428c-9fc2-c37b2e224b54-kube-api-access-qtvx2\") pod \"c5d6795c-254f-428c-9fc2-c37b2e224b54\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.428457 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-catalog-content\") pod \"c5d6795c-254f-428c-9fc2-c37b2e224b54\" (UID: \"c5d6795c-254f-428c-9fc2-c37b2e224b54\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.429057 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-utilities" (OuterVolumeSpecName: "utilities") pod "c5d6795c-254f-428c-9fc2-c37b2e224b54" (UID: "c5d6795c-254f-428c-9fc2-c37b2e224b54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.429603 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-utilities" (OuterVolumeSpecName: "utilities") pod "c89065be-d4d7-4201-b4fd-f1bc18df6a60" (UID: "c89065be-d4d7-4201-b4fd-f1bc18df6a60"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.432617 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d6795c-254f-428c-9fc2-c37b2e224b54-kube-api-access-qtvx2" (OuterVolumeSpecName: "kube-api-access-qtvx2") pod "c5d6795c-254f-428c-9fc2-c37b2e224b54" (UID: "c5d6795c-254f-428c-9fc2-c37b2e224b54"). InnerVolumeSpecName "kube-api-access-qtvx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.442433 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89065be-d4d7-4201-b4fd-f1bc18df6a60-kube-api-access-vqwm5" (OuterVolumeSpecName: "kube-api-access-vqwm5") pod "c89065be-d4d7-4201-b4fd-f1bc18df6a60" (UID: "c89065be-d4d7-4201-b4fd-f1bc18df6a60"). InnerVolumeSpecName "kube-api-access-vqwm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.459928 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c89065be-d4d7-4201-b4fd-f1bc18df6a60" (UID: "c89065be-d4d7-4201-b4fd-f1bc18df6a60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.490247 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.505348 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5d6795c-254f-428c-9fc2-c37b2e224b54" (UID: "c5d6795c-254f-428c-9fc2-c37b2e224b54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529071 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-trusted-ca\") pod \"ca945ba8-363c-4e60-b11a-6938e4cb9354\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529159 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm89x\" (UniqueName: \"kubernetes.io/projected/ca945ba8-363c-4e60-b11a-6938e4cb9354-kube-api-access-dm89x\") pod \"ca945ba8-363c-4e60-b11a-6938e4cb9354\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529189 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-operator-metrics\") pod \"ca945ba8-363c-4e60-b11a-6938e4cb9354\" (UID: \"ca945ba8-363c-4e60-b11a-6938e4cb9354\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529465 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529482 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89065be-d4d7-4201-b4fd-f1bc18df6a60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529492 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529501 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqwm5\" (UniqueName: \"kubernetes.io/projected/c89065be-d4d7-4201-b4fd-f1bc18df6a60-kube-api-access-vqwm5\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529511 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtvx2\" (UniqueName: \"kubernetes.io/projected/c5d6795c-254f-428c-9fc2-c37b2e224b54-kube-api-access-qtvx2\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.529519 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d6795c-254f-428c-9fc2-c37b2e224b54-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.530081 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ca945ba8-363c-4e60-b11a-6938e4cb9354" (UID: "ca945ba8-363c-4e60-b11a-6938e4cb9354"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.533237 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa" path="/var/lib/kubelet/pods/2a3d78d9-765c-4ee2-8ecc-ff3cca7c6daa/volumes" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.533774 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ca945ba8-363c-4e60-b11a-6938e4cb9354" (UID: "ca945ba8-363c-4e60-b11a-6938e4cb9354"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.534026 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba637ae5-45df-40ef-bf20-9a66e079197e" path="/var/lib/kubelet/pods/ba637ae5-45df-40ef-bf20-9a66e079197e/volumes" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.534717 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d097d82c-1ab8-4c6e-80e2-71dcb09ac05f" path="/var/lib/kubelet/pods/d097d82c-1ab8-4c6e-80e2-71dcb09ac05f/volumes" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.538898 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca945ba8-363c-4e60-b11a-6938e4cb9354-kube-api-access-dm89x" (OuterVolumeSpecName: "kube-api-access-dm89x") pod "ca945ba8-363c-4e60-b11a-6938e4cb9354" (UID: "ca945ba8-363c-4e60-b11a-6938e4cb9354"). InnerVolumeSpecName "kube-api-access-dm89x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.634986 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm89x\" (UniqueName: \"kubernetes.io/projected/ca945ba8-363c-4e60-b11a-6938e4cb9354-kube-api-access-dm89x\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.635014 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.635024 4728 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca945ba8-363c-4e60-b11a-6938e4cb9354-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: E1216 15:00:51.670211 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89065be_d4d7_4201_b4fd_f1bc18df6a60.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5541d34_e213_4545_af81_6410a52db88d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d6795c_254f_428c_9fc2_c37b2e224b54.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5541d34_e213_4545_af81_6410a52db88d.slice/crio-a29edac15c726cff514496e4484b8f4f5c3e2858fdbbed5651bf1ce543bfa626\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d6795c_254f_428c_9fc2_c37b2e224b54.slice/crio-b930c8141146b334971e96d95434527a03670ae2a2b4387fd972312f715440ac\": RecentStats: unable to find data in memory cache]" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.701467 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.735796 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-utilities\") pod \"e74a33ea-23b7-47fc-a463-566f8b579917\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.735839 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-catalog-content\") pod \"e74a33ea-23b7-47fc-a463-566f8b579917\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.735897 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxdf\" (UniqueName: \"kubernetes.io/projected/e74a33ea-23b7-47fc-a463-566f8b579917-kube-api-access-pbxdf\") pod \"e74a33ea-23b7-47fc-a463-566f8b579917\" (UID: \"e74a33ea-23b7-47fc-a463-566f8b579917\") " Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.736792 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-utilities" (OuterVolumeSpecName: "utilities") pod "e74a33ea-23b7-47fc-a463-566f8b579917" (UID: "e74a33ea-23b7-47fc-a463-566f8b579917"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.739712 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74a33ea-23b7-47fc-a463-566f8b579917-kube-api-access-pbxdf" (OuterVolumeSpecName: "kube-api-access-pbxdf") pod "e74a33ea-23b7-47fc-a463-566f8b579917" (UID: "e74a33ea-23b7-47fc-a463-566f8b579917"). InnerVolumeSpecName "kube-api-access-pbxdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.837809 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.837836 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxdf\" (UniqueName: \"kubernetes.io/projected/e74a33ea-23b7-47fc-a463-566f8b579917-kube-api-access-pbxdf\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.857268 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e74a33ea-23b7-47fc-a463-566f8b579917" (UID: "e74a33ea-23b7-47fc-a463-566f8b579917"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:00:51 crc kubenswrapper[4728]: I1216 15:00:51.938792 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74a33ea-23b7-47fc-a463-566f8b579917-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.024509 4728 generic.go:334] "Generic (PLEG): container finished" podID="e74a33ea-23b7-47fc-a463-566f8b579917" containerID="8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6" exitCode=0 Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.024597 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vw5z" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.024597 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerDied","Data":"8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.024657 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vw5z" event={"ID":"e74a33ea-23b7-47fc-a463-566f8b579917","Type":"ContainerDied","Data":"46699ea4e925835ddc8acfedf187576712037b7401ba4a42cc75b59e8e2648b2"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.024676 4728 scope.go:117] "RemoveContainer" containerID="8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.026042 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" event={"ID":"ca945ba8-363c-4e60-b11a-6938e4cb9354","Type":"ContainerDied","Data":"369ffc81a7c153229455658439ebc6525c9ad8ed9561d510db94d7020fd86991"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.026061 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k6z5" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.027950 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" event={"ID":"26fa192c-181f-41a5-9e6e-cfa5defa2e56","Type":"ContainerStarted","Data":"d53e9897483990f0ae2cd75c6ba6e21a78ecf91244d5ec662441e7fa41d96559"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.028002 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" event={"ID":"26fa192c-181f-41a5-9e6e-cfa5defa2e56","Type":"ContainerStarted","Data":"c0d35edf9aeff8adebfb2518fba7fcb12d14fb757327d7855d8216a20396bcb8"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.028102 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.037118 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgppn" event={"ID":"c5d6795c-254f-428c-9fc2-c37b2e224b54","Type":"ContainerDied","Data":"b930c8141146b334971e96d95434527a03670ae2a2b4387fd972312f715440ac"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.037200 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rgppn" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.039796 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" event={"ID":"54f118e6-46e7-4cd4-84fc-491746adedb2","Type":"ContainerStarted","Data":"4571a806ff9725f426b302fd4afb96142a6d59928aa6626c17d96508b37e4fcb"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.041225 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.041874 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" event={"ID":"28557b66-a02a-4c9e-880f-3d9f21e5892b","Type":"ContainerStarted","Data":"f19d22c7fdfe355249c852995cedd3ccdab63e9e9b677e71240b6d1aab259f1a"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.041909 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" event={"ID":"28557b66-a02a-4c9e-880f-3d9f21e5892b","Type":"ContainerStarted","Data":"5ab7d9742f26a4671a8d2fdc06bff6691cf34ef3019e4e3279ed1eb16247fe90"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.042545 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.044746 4728 scope.go:117] "RemoveContainer" containerID="feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.048059 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78qzz" event={"ID":"d5541d34-e213-4545-af81-6410a52db88d","Type":"ContainerDied","Data":"a29edac15c726cff514496e4484b8f4f5c3e2858fdbbed5651bf1ce543bfa626"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.048087 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-78qzz" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.050683 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.050903 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cb68986fd-zrkh2" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.053653 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxlrz" event={"ID":"c89065be-d4d7-4201-b4fd-f1bc18df6a60","Type":"ContainerDied","Data":"17b14ce578a6fdec235d42d21ddfffc03c5827aaa3b49f8b8244f19f4f939058"} Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.053736 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxlrz" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.060832 4728 scope.go:117] "RemoveContainer" containerID="9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.076497 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-585785d5f4-9l9dw" podStartSLOduration=6.076481343 podStartE2EDuration="6.076481343s" podCreationTimestamp="2025-12-16 15:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:52.053344682 +0000 UTC m=+232.893523666" watchObservedRunningTime="2025-12-16 15:00:52.076481343 +0000 UTC m=+232.916660327" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.082915 4728 scope.go:117] "RemoveContainer" containerID="8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.083399 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6\": container with ID starting with 8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6 not found: ID does not exist" containerID="8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.083478 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6"} err="failed to get container status \"8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6\": rpc error: code = NotFound desc = could not find container \"8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6\": container with ID starting with 8c4f31ba4f66f1cfd9955a81b0dfa44657c88576bcf0cac1adbf298b0fa4cfd6 not found: ID does not exist" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.083497 4728 scope.go:117] "RemoveContainer" containerID="feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.084292 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7\": container with ID starting with feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7 not found: ID does not exist" containerID="feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.084353 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7"} err="failed to get container status \"feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7\": rpc error: code = NotFound desc = could not find container \"feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7\": container with ID starting with feecb7dee18a950766f48213c7db52b97f04d948d67d7681ff50ff12dc82dde7 not found: ID does not exist" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.084370 4728 scope.go:117] "RemoveContainer" containerID="9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d" Dec 16 15:00:52 crc 
kubenswrapper[4728]: E1216 15:00:52.084672 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d\": container with ID starting with 9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d not found: ID does not exist" containerID="9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.084715 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d"} err="failed to get container status \"9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d\": rpc error: code = NotFound desc = could not find container \"9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d\": container with ID starting with 9c78c4646e45d64fb8e623fad6ace1d43baf7101d4ea728abee59f2d1546e59d not found: ID does not exist" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.084731 4728 scope.go:117] "RemoveContainer" containerID="3f7db6888f46974d186c856ac3e6417cc401141d6d4952e0fc6f96412525d753" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.100729 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k6z5"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.109998 4728 scope.go:117] "RemoveContainer" containerID="8f2abbf346de87e3d9e27aa179154aabf71a697abedd8b573eac53cd71bc9d1d" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.110141 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k6z5"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.119436 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-78qzz"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.124709 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-78qzz"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.126611 4728 scope.go:117] "RemoveContainer" containerID="a1a04c3d312fa4b3ee2c5c2de1af60c5c59c6f2d639e6836598ff5d1a7f2e2cb" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.127398 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgppn"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.131478 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rgppn"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.139118 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vw5z"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.142624 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6vw5z"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.145235 4728 scope.go:117] "RemoveContainer" containerID="c748d69a2439a0f9c53d84e0ab65d50b185532bbab20e210a06b1d69b610e1b4" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.155234 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dhkkk" podStartSLOduration=2.155212964 podStartE2EDuration="2.155212964s" podCreationTimestamp="2025-12-16 15:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:00:52.151973141 +0000 UTC m=+232.992152125" watchObservedRunningTime="2025-12-16 15:00:52.155212964 +0000 UTC m=+232.995391958" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.160493 4728 scope.go:117] "RemoveContainer" containerID="a691516666a2d18185742e5597d9ba7b7b07b0b3fe5c2225eec93cbffde2feb2" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.187307 4728 scope.go:117] "RemoveContainer" containerID="deac880a9a055f0fe191962c30f98cd232afd7bf438f6bcf66f21eaa1e144559" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.205165 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxlrz"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.207883 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxlrz"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.211246 4728 scope.go:117] "RemoveContainer" containerID="bad6df5493f9eaa82024fc4ece34b0f8e49ba896e9adc0aef31b2a69ccea7bbe" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.232634 4728 scope.go:117] "RemoveContainer" containerID="9eb8c21a19472ffec46d107bf6767cf77b9eac889d419d27c130ea70bc11c2a8" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.249506 4728 scope.go:117] "RemoveContainer" containerID="f0c68431491303007bca2545a9558391c24c9cee44e80cd1bb80be3962852403" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.260496 4728 scope.go:117] "RemoveContainer" containerID="560a2330fd10ae2a1ca2d67b32d59081811a8de6d20cecea3666d64ab7e253e5" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950580 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7wwrk"] Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950772 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950784 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950796 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950802 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950811 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950817 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950824 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca945ba8-363c-4e60-b11a-6938e4cb9354" containerName="marketplace-operator" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950829 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca945ba8-363c-4e60-b11a-6938e4cb9354" containerName="marketplace-operator" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950837 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950843 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950850 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950856 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950863 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950868 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950876 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950881 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950890 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950895 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950903 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950908 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950918 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950924 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="extract-content" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950934 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950940 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="extract-utilities" Dec 16 15:00:52 crc kubenswrapper[4728]: E1216 15:00:52.950946 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.950951 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.951032 4728 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.951040 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5541d34-e213-4545-af81-6410a52db88d" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.951049 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.951056 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" containerName="registry-server" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.951063 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca945ba8-363c-4e60-b11a-6938e4cb9354" containerName="marketplace-operator" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.951954 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.961075 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7wwrk"] Dec 16 15:00:52 crc kubenswrapper[4728]: I1216 15:00:52.961656 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.058312 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8v55\" (UniqueName: \"kubernetes.io/projected/001d33fe-6bb7-4554-919c-e990321a2590-kube-api-access-n8v55\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.058359 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001d33fe-6bb7-4554-919c-e990321a2590-catalog-content\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.058480 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001d33fe-6bb7-4554-919c-e990321a2590-utilities\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.151769 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jh2cv"] Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.153340 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.155879 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.162518 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001d33fe-6bb7-4554-919c-e990321a2590-utilities\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.162615 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8v55\" (UniqueName: \"kubernetes.io/projected/001d33fe-6bb7-4554-919c-e990321a2590-kube-api-access-n8v55\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.162669 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001d33fe-6bb7-4554-919c-e990321a2590-catalog-content\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.163230 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001d33fe-6bb7-4554-919c-e990321a2590-utilities\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.163706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001d33fe-6bb7-4554-919c-e990321a2590-catalog-content\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.169045 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jh2cv"] Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.180364 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8v55\" (UniqueName: \"kubernetes.io/projected/001d33fe-6bb7-4554-919c-e990321a2590-kube-api-access-n8v55\") pod \"community-operators-7wwrk\" (UID: \"001d33fe-6bb7-4554-919c-e990321a2590\") " pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.264532 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrlm5\" (UniqueName: \"kubernetes.io/projected/91e5f218-48b8-47f0-825c-f9eea263b64c-kube-api-access-xrlm5\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.264729 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-utilities\") pod \"redhat-operators-jh2cv\" (UID: 
\"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.264783 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-catalog-content\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.273143 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.366549 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-utilities\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.366638 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-catalog-content\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.366729 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrlm5\" (UniqueName: \"kubernetes.io/projected/91e5f218-48b8-47f0-825c-f9eea263b64c-kube-api-access-xrlm5\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.367053 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-utilities\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.367383 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-catalog-content\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.382632 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrlm5\" (UniqueName: \"kubernetes.io/projected/91e5f218-48b8-47f0-825c-f9eea263b64c-kube-api-access-xrlm5\") pod \"redhat-operators-jh2cv\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.472858 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.512837 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d6795c-254f-428c-9fc2-c37b2e224b54" path="/var/lib/kubelet/pods/c5d6795c-254f-428c-9fc2-c37b2e224b54/volumes" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.513425 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89065be-d4d7-4201-b4fd-f1bc18df6a60" path="/var/lib/kubelet/pods/c89065be-d4d7-4201-b4fd-f1bc18df6a60/volumes" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.514028 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca945ba8-363c-4e60-b11a-6938e4cb9354" path="/var/lib/kubelet/pods/ca945ba8-363c-4e60-b11a-6938e4cb9354/volumes" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.515036 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5541d34-e213-4545-af81-6410a52db88d" path="/var/lib/kubelet/pods/d5541d34-e213-4545-af81-6410a52db88d/volumes" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.515600 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74a33ea-23b7-47fc-a463-566f8b579917" path="/var/lib/kubelet/pods/e74a33ea-23b7-47fc-a463-566f8b579917/volumes" Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.703704 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7wwrk"] Dec 16 15:00:53 crc kubenswrapper[4728]: W1216 15:00:53.711822 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod001d33fe_6bb7_4554_919c_e990321a2590.slice/crio-9fd809673d1e69e9405b2649dad091908d23557df07f694b70586f8f8f27b8f0 WatchSource:0}: Error finding container 9fd809673d1e69e9405b2649dad091908d23557df07f694b70586f8f8f27b8f0: Status 404 returned error can't find the container with id 9fd809673d1e69e9405b2649dad091908d23557df07f694b70586f8f8f27b8f0 Dec 16 15:00:53 crc kubenswrapper[4728]: I1216 15:00:53.860611 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jh2cv"] Dec 16 15:00:53 crc kubenswrapper[4728]: W1216 15:00:53.924529 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e5f218_48b8_47f0_825c_f9eea263b64c.slice/crio-76b3d3c71bf47175c17d5aa259ba2e89d946431ffa72506ee8e91155eab412ce WatchSource:0}: Error finding container 76b3d3c71bf47175c17d5aa259ba2e89d946431ffa72506ee8e91155eab412ce: Status 404 returned error can't find the container with id 76b3d3c71bf47175c17d5aa259ba2e89d946431ffa72506ee8e91155eab412ce Dec 16 15:00:54 crc kubenswrapper[4728]: I1216 15:00:54.070286 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerStarted","Data":"82d0513e5b16abca1fed5d6bf9b535a054b809c4ed37bdc8952518b055efe58d"} Dec 16 15:00:54 crc kubenswrapper[4728]: I1216 15:00:54.070331 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerStarted","Data":"76b3d3c71bf47175c17d5aa259ba2e89d946431ffa72506ee8e91155eab412ce"} Dec 16 15:00:54 crc kubenswrapper[4728]: I1216 15:00:54.073071 4728 generic.go:334] "Generic (PLEG): container finished" podID="001d33fe-6bb7-4554-919c-e990321a2590" 
containerID="12e43d54fd17a69fa23987f4f3026b3e2902ddc8f0329bc4971fbfaf50713ae9" exitCode=0 Dec 16 15:00:54 crc kubenswrapper[4728]: I1216 15:00:54.073914 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wwrk" event={"ID":"001d33fe-6bb7-4554-919c-e990321a2590","Type":"ContainerDied","Data":"12e43d54fd17a69fa23987f4f3026b3e2902ddc8f0329bc4971fbfaf50713ae9"} Dec 16 15:00:54 crc kubenswrapper[4728]: I1216 15:00:54.073936 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wwrk" event={"ID":"001d33fe-6bb7-4554-919c-e990321a2590","Type":"ContainerStarted","Data":"9fd809673d1e69e9405b2649dad091908d23557df07f694b70586f8f8f27b8f0"} Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.083031 4728 generic.go:334] "Generic (PLEG): container finished" podID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerID="82d0513e5b16abca1fed5d6bf9b535a054b809c4ed37bdc8952518b055efe58d" exitCode=0 Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.083113 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerDied","Data":"82d0513e5b16abca1fed5d6bf9b535a054b809c4ed37bdc8952518b055efe58d"} Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.348545 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pln52"] Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.349486 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.354218 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.357700 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pln52"] Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.389117 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-catalog-content\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.389240 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-utilities\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.389273 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkkbm\" (UniqueName: \"kubernetes.io/projected/12774c70-805e-47d0-9c1f-e0b59a4f9d06-kube-api-access-qkkbm\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.490346 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-utilities\") pod 
\"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.490437 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkkbm\" (UniqueName: \"kubernetes.io/projected/12774c70-805e-47d0-9c1f-e0b59a4f9d06-kube-api-access-qkkbm\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.490471 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-catalog-content\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.491142 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-utilities\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.491162 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-catalog-content\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.510599 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkkbm\" (UniqueName: \"kubernetes.io/projected/12774c70-805e-47d0-9c1f-e0b59a4f9d06-kube-api-access-qkkbm\") pod \"certified-operators-pln52\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.551297 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5t4gg"] Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.552369 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.554604 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.564806 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t4gg"] Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.592312 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35eada5-7775-4d8c-92e3-c744b7f223a1-utilities\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.592458 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35eada5-7775-4d8c-92e3-c744b7f223a1-catalog-content\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.592517 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnf2\" (UniqueName: \"kubernetes.io/projected/c35eada5-7775-4d8c-92e3-c744b7f223a1-kube-api-access-zhnf2\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.669789 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.694095 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35eada5-7775-4d8c-92e3-c744b7f223a1-utilities\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.694198 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35eada5-7775-4d8c-92e3-c744b7f223a1-catalog-content\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.694247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnf2\" (UniqueName: \"kubernetes.io/projected/c35eada5-7775-4d8c-92e3-c744b7f223a1-kube-api-access-zhnf2\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.694281 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c35eada5-7775-4d8c-92e3-c744b7f223a1-utilities\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.694490 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c35eada5-7775-4d8c-92e3-c744b7f223a1-catalog-content\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.719266 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnf2\" (UniqueName: \"kubernetes.io/projected/c35eada5-7775-4d8c-92e3-c744b7f223a1-kube-api-access-zhnf2\") pod \"redhat-marketplace-5t4gg\" (UID: \"c35eada5-7775-4d8c-92e3-c744b7f223a1\") " pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:55 crc kubenswrapper[4728]: I1216 15:00:55.875445 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:00:56 crc kubenswrapper[4728]: I1216 15:00:56.077556 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pln52"] Dec 16 15:00:56 crc kubenswrapper[4728]: I1216 15:00:56.094808 4728 generic.go:334] "Generic (PLEG): container finished" podID="001d33fe-6bb7-4554-919c-e990321a2590" containerID="7ca039cca0042b93db7b67a07c3a18ab8d3630ba8b419ce92f06bbb1b2be2896" exitCode=0 Dec 16 15:00:56 crc kubenswrapper[4728]: I1216 15:00:56.094854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wwrk" event={"ID":"001d33fe-6bb7-4554-919c-e990321a2590","Type":"ContainerDied","Data":"7ca039cca0042b93db7b67a07c3a18ab8d3630ba8b419ce92f06bbb1b2be2896"} Dec 16 15:00:56 crc kubenswrapper[4728]: I1216 15:00:56.283277 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t4gg"] Dec 16 15:00:56 crc kubenswrapper[4728]: W1216 15:00:56.376457 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35eada5_7775_4d8c_92e3_c744b7f223a1.slice/crio-97511c264eb20cfc68710451f7c4cab4a70c57b1b12ac2cf1ab8712de5d9f0c4 WatchSource:0}: Error finding container 97511c264eb20cfc68710451f7c4cab4a70c57b1b12ac2cf1ab8712de5d9f0c4: Status 404 returned error can't find the container with id 97511c264eb20cfc68710451f7c4cab4a70c57b1b12ac2cf1ab8712de5d9f0c4 Dec 16 15:00:57 crc kubenswrapper[4728]: I1216 15:00:57.102424 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerStarted","Data":"7d09660bad10ffad1a6739e5a428f0ca506033f5eb5ee6c8b81678bc28fa2f36"} Dec 16 15:00:57 crc kubenswrapper[4728]: I1216 15:00:57.104016 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t4gg" event={"ID":"c35eada5-7775-4d8c-92e3-c744b7f223a1","Type":"ContainerStarted","Data":"99c3956b77d4152321ff6ca4e4e69167c519d244355a3dd13e59dd4fd7eaa998"} Dec 16 15:00:57 crc kubenswrapper[4728]: I1216 15:00:57.104044 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t4gg" event={"ID":"c35eada5-7775-4d8c-92e3-c744b7f223a1","Type":"ContainerStarted","Data":"97511c264eb20cfc68710451f7c4cab4a70c57b1b12ac2cf1ab8712de5d9f0c4"} Dec 16 15:00:57 crc kubenswrapper[4728]: I1216 15:00:57.106642 4728 generic.go:334] "Generic (PLEG): container finished" podID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerID="6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18" exitCode=0 Dec 16 15:00:57 crc kubenswrapper[4728]: I1216 15:00:57.106674 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerDied","Data":"6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18"} Dec 16 15:00:57 crc kubenswrapper[4728]: I1216 15:00:57.106691 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerStarted","Data":"9d9f6bc56e9b9cf80a210b06584d30c7c5de1c6abb9e44bb7499d0a27a718d0f"} Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.113000 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerID="7d09660bad10ffad1a6739e5a428f0ca506033f5eb5ee6c8b81678bc28fa2f36" exitCode=0 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.113156 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerDied","Data":"7d09660bad10ffad1a6739e5a428f0ca506033f5eb5ee6c8b81678bc28fa2f36"} Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.118918 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7wwrk" event={"ID":"001d33fe-6bb7-4554-919c-e990321a2590","Type":"ContainerStarted","Data":"bf9fe8eb8dafb7007fa3186f22c8e9c11a71bf3e165ee1f413258a04dccd6e9d"} Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.121167 4728 generic.go:334] "Generic (PLEG): container finished" podID="c35eada5-7775-4d8c-92e3-c744b7f223a1" containerID="99c3956b77d4152321ff6ca4e4e69167c519d244355a3dd13e59dd4fd7eaa998" exitCode=0 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.121213 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t4gg" event={"ID":"c35eada5-7775-4d8c-92e3-c744b7f223a1","Type":"ContainerDied","Data":"99c3956b77d4152321ff6ca4e4e69167c519d244355a3dd13e59dd4fd7eaa998"} Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.123247 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerStarted","Data":"e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95"} Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.187241 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7wwrk" podStartSLOduration=2.881562442 podStartE2EDuration="6.187219907s" podCreationTimestamp="2025-12-16 15:00:52 +0000 UTC" firstStartedPulling="2025-12-16 15:00:54.074733813 +0000 UTC m=+234.914912837" lastFinishedPulling="2025-12-16 15:00:57.380391318 +0000 UTC m=+238.220570302" observedRunningTime="2025-12-16 15:00:58.185298728 +0000 UTC m=+239.025477712" watchObservedRunningTime="2025-12-16 15:00:58.187219907 +0000 UTC m=+239.027398911" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.485617 4728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486286 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486347 4728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486872 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14" gracePeriod=15 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486932 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90" gracePeriod=15 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486956 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b" gracePeriod=15 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486896 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687" gracePeriod=15 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.486936 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2" gracePeriod=15 Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487498 4728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 15:00:58 crc kubenswrapper[4728]: E1216 15:00:58.487648 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487663 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 16 15:00:58 crc kubenswrapper[4728]: E1216 15:00:58.487671 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487677 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 15:00:58 crc kubenswrapper[4728]: E1216 15:00:58.487687 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487694 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 15:00:58 crc 
kubenswrapper[4728]: E1216 15:00:58.487703 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487710 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 15:00:58 crc kubenswrapper[4728]: E1216 15:00:58.487719 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487727 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 15:00:58 crc kubenswrapper[4728]: E1216 15:00:58.487735 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487796 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 15:00:58 crc kubenswrapper[4728]: E1216 15:00:58.487820 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487861 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487980 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.487993 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.488003 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.488014 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.488022 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.488193 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.519793 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529555 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: 
I1216 15:00:58.529601 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529623 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529649 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529671 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529694 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529781 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.529937 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.630981 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631306 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631345 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631364 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631384 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631432 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631394 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631452 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631153 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631458 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631519 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: 
I1216 15:00:58.631518 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631554 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631531 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631531 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.631663 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: I1216 15:00:58.810568 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:00:58 crc kubenswrapper[4728]: W1216 15:00:58.829157 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a9f0c5500bcc8487cfd2a61a3b2ea727ffaf2ea69ad40906f5c9f62b60667908 WatchSource:0}: Error finding container a9f0c5500bcc8487cfd2a61a3b2ea727ffaf2ea69ad40906f5c9f62b60667908: Status 404 returned error can't find the container with id a9f0c5500bcc8487cfd2a61a3b2ea727ffaf2ea69ad40906f5c9f62b60667908 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.132077 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.133289 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.133949 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687" exitCode=0 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.133972 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2" exitCode=0 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.133980 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90" exitCode=0 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.133986 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b" exitCode=2 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.134038 4728 scope.go:117] "RemoveContainer" containerID="b6356912a2e476b884c22f41d4357ceda5b0036302fd78a1e8f62d0c735931d2" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.137657 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerStarted","Data":"26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6"} Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.138331 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.138670 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.138911 4728 status_manager.go:851] "Failed to get status for pod" 
podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.140484 4728 generic.go:334] "Generic (PLEG): container finished" podID="42661dad-bece-4f43-9621-9c04d54ecb5c" containerID="ef96257eca7490f60c46d116791afdb7b453659a8af1ddd04a380412763a291b" exitCode=0 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.140556 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"42661dad-bece-4f43-9621-9c04d54ecb5c","Type":"ContainerDied","Data":"ef96257eca7490f60c46d116791afdb7b453659a8af1ddd04a380412763a291b"} Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.141219 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.141518 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.141769 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.141912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"233967acac0c91bddd4b6ae2b5089ca6520a2f51f40521cdaed11d977eefdd42"} Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.141943 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a9f0c5500bcc8487cfd2a61a3b2ea727ffaf2ea69ad40906f5c9f62b60667908"} Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.142013 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.142235 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.142394 4728 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.142563 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.142708 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.143768 4728 generic.go:334] "Generic (PLEG): container finished" podID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerID="e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95" exitCode=0 Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.143807 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerDied","Data":"e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95"} Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.144285 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.144607 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.145049 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.145284 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.145547 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.508257 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.508537 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.508734 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.508982 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:00:59 crc kubenswrapper[4728]: I1216 15:00:59.509609 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:00 crc kubenswrapper[4728]: I1216 15:01:00.154246 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.163650 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"42661dad-bece-4f43-9621-9c04d54ecb5c","Type":"ContainerDied","Data":"8ae96622b989611f0ad04c8f26e5cde0d718f621f50feeb65440004d60b5053b"} Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.164022 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ae96622b989611f0ad04c8f26e5cde0d718f621f50feeb65440004d60b5053b" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.396596 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.397204 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.397598 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.398005 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.398343 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480091 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42661dad-bece-4f43-9621-9c04d54ecb5c-kube-api-access\") pod \"42661dad-bece-4f43-9621-9c04d54ecb5c\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480154 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-kubelet-dir\") pod \"42661dad-bece-4f43-9621-9c04d54ecb5c\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480180 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-var-lock\") pod \"42661dad-bece-4f43-9621-9c04d54ecb5c\" (UID: \"42661dad-bece-4f43-9621-9c04d54ecb5c\") " Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480248 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "42661dad-bece-4f43-9621-9c04d54ecb5c" (UID: "42661dad-bece-4f43-9621-9c04d54ecb5c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480324 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-var-lock" (OuterVolumeSpecName: "var-lock") pod "42661dad-bece-4f43-9621-9c04d54ecb5c" (UID: "42661dad-bece-4f43-9621-9c04d54ecb5c"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480548 4728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.480567 4728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42661dad-bece-4f43-9621-9c04d54ecb5c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.484546 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42661dad-bece-4f43-9621-9c04d54ecb5c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "42661dad-bece-4f43-9621-9c04d54ecb5c" (UID: "42661dad-bece-4f43-9621-9c04d54ecb5c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:01 crc kubenswrapper[4728]: I1216 15:01:01.581305 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42661dad-bece-4f43-9621-9c04d54ecb5c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:01 crc kubenswrapper[4728]: E1216 15:01:01.581452 4728 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.210:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" volumeName="registry-storage" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.136821 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.137738 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.138125 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.138429 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.138708 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.138920 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.139595 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.169040 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.170575 4728 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14" exitCode=0 Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.170684 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.170697 4728 scope.go:117] "RemoveContainer" containerID="b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.175827 4728 generic.go:334] "Generic (PLEG): container finished" podID="c35eada5-7775-4d8c-92e3-c744b7f223a1" containerID="ce2c82bed3ae433dd45ee635cfa0a4f0fb1561cef002256e108d72da3f4922f2" exitCode=0 Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.175867 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t4gg" event={"ID":"c35eada5-7775-4d8c-92e3-c744b7f223a1","Type":"ContainerDied","Data":"ce2c82bed3ae433dd45ee635cfa0a4f0fb1561cef002256e108d72da3f4922f2"} Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.176468 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.182660 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.183062 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.183288 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.183689 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.183906 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.188802 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.191888 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerStarted","Data":"da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac"} Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.192011 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.192282 4728 scope.go:117] "RemoveContainer" containerID="fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.192530 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.192886 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.193867 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.194849 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.195544 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.197239 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.197478 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.197845 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.198934 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.199143 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.199370 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.208195 4728 scope.go:117] "RemoveContainer" containerID="316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.229828 4728 scope.go:117] "RemoveContainer" containerID="1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.244206 4728 scope.go:117] "RemoveContainer" containerID="970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.257138 4728 scope.go:117] "RemoveContainer" containerID="afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.277545 4728 scope.go:117] "RemoveContainer" containerID="b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.278038 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\": container with ID starting with b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687 not found: ID does not exist" containerID="b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.278083 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687"} err="failed to get container status \"b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\": rpc error: code = NotFound desc = could not find container 
\"b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687\": container with ID starting with b6dca8810efc389ddf362768bfe1c63cb49ed3644397d89461e52cd7378db687 not found: ID does not exist" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.278112 4728 scope.go:117] "RemoveContainer" containerID="fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.278425 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\": container with ID starting with fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2 not found: ID does not exist" containerID="fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.278464 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2"} err="failed to get container status \"fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\": rpc error: code = NotFound desc = could not find container \"fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2\": container with ID starting with fdcac05b7c5f190f0d2cd591e55160fee1ca5da8713dd325bdecc6b4a893b6e2 not found: ID does not exist" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.278489 4728 scope.go:117] "RemoveContainer" containerID="316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.278864 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\": container with ID starting with 316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90 not found: ID does not exist" containerID="316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.278888 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90"} err="failed to get container status \"316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\": rpc error: code = NotFound desc = could not find container \"316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90\": container with ID starting with 316452da63b0cd5162ca2119afba462854d4162e4715ffb6ae86e7802082ed90 not found: ID does not exist" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.278904 4728 scope.go:117] "RemoveContainer" containerID="1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.279380 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\": container with ID starting with 1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b not found: ID does not exist" containerID="1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.279430 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b"} 
err="failed to get container status \"1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\": rpc error: code = NotFound desc = could not find container \"1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b\": container with ID starting with 1cad80e03ab3bd798901a7d356e3eb074cc01ddec8113b2b9960ef6bc0299c3b not found: ID does not exist" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.279457 4728 scope.go:117] "RemoveContainer" containerID="970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.279904 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\": container with ID starting with 970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14 not found: ID does not exist" containerID="970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.279933 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14"} err="failed to get container status \"970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\": rpc error: code = NotFound desc = could not find container \"970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14\": container with ID starting with 970f09a5effbd2eb411483ca5d3a20790243d969708d8149855ed966ed859e14 not found: ID does not exist" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.279952 4728 scope.go:117] "RemoveContainer" containerID="afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.280296 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\": container with ID starting with afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2 not found: ID does not exist" containerID="afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.280331 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2"} err="failed to get container status \"afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\": rpc error: code = NotFound desc = could not find container \"afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2\": container with ID starting with afe62c1b874afe67614d473dc9d2df9af0a3372b57f701dfb42c3679d511e0b2 not found: ID does not exist" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.293776 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.293884 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 
15:01:02.293891 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.293905 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.293935 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.294002 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.294368 4728 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.294386 4728 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.294396 4728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.489601 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.490379 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.490897 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.491184 4728 
status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.491652 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.491950 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.742962 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.743359 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.743905 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.744219 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.744515 4728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:02 crc kubenswrapper[4728]: I1216 15:01:02.744551 4728 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.744856 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="200ms" Dec 16 15:01:02 crc kubenswrapper[4728]: E1216 15:01:02.946065 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="400ms" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.196232 4728 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t4gg" event={"ID":"c35eada5-7775-4d8c-92e3-c744b7f223a1","Type":"ContainerStarted","Data":"6513189559e4a1fb4da7f7d985be6454682c1d06fb827e756b3713f7afa3c047"} Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.197127 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.197596 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.197935 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.198196 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.198456 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.198706 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.273859 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.274128 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.318242 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.318768 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: 
connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.319144 4728 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.319612 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.319882 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.320098 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.320308 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.321682 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:03 crc kubenswrapper[4728]: E1216 15:01:03.346658 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="800ms" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.474479 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.474519 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:01:03 crc kubenswrapper[4728]: I1216 15:01:03.513887 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 16 15:01:03 crc kubenswrapper[4728]: E1216 15:01:03.538026 4728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.210:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-jh2cv.1881ba3354e957b2 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-jh2cv,UID:91e5f218-48b8-47f0-825c-f9eea263b64c,APIVersion:v1,ResourceVersion:29943,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 421ms (421ms including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 15:00:58.536622002 +0000 UTC m=+239.376800986,LastTimestamp:2025-12-16 15:00:58.536622002 +0000 UTC m=+239.376800986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 15:01:04 crc kubenswrapper[4728]: E1216 15:01:04.147378 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="1.6s" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.256004 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7wwrk" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.256575 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.256815 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.257008 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.257181 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.257359 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: 
connection refused" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.257626 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:04 crc kubenswrapper[4728]: I1216 15:01:04.512370 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jh2cv" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="registry-server" probeResult="failure" output=< Dec 16 15:01:04 crc kubenswrapper[4728]: timeout: failed to connect service ":50051" within 1s Dec 16 15:01:04 crc kubenswrapper[4728]: > Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.670652 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.670744 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.723153 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.723949 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.724624 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.725478 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.726022 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.726390 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.727102 4728 status_manager.go:851] "Failed to get status for pod" 
podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: E1216 15:01:05.748508 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="3.2s" Dec 16 15:01:05 crc kubenswrapper[4728]: E1216 15:01:05.771071 4728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.210:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-jh2cv.1881ba3354e957b2 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-jh2cv,UID:91e5f218-48b8-47f0-825c-f9eea263b64c,APIVersion:v1,ResourceVersion:29943,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 421ms (421ms including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-16 15:00:58.536622002 +0000 UTC m=+239.376800986,LastTimestamp:2025-12-16 15:00:58.536622002 +0000 UTC m=+239.376800986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.876464 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.877109 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.935948 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.936720 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.937327 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.938896 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 
38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.939372 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.939966 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:05 crc kubenswrapper[4728]: I1216 15:01:05.940626 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.272956 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pln52" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.273704 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.274148 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.274644 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.274909 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.275169 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:06 crc kubenswrapper[4728]: I1216 15:01:06.275466 4728 status_manager.go:851] "Failed to get status for pod" 
podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:08 crc kubenswrapper[4728]: E1216 15:01:08.950158 4728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="6.4s" Dec 16 15:01:09 crc kubenswrapper[4728]: I1216 15:01:09.511979 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:09 crc kubenswrapper[4728]: I1216 15:01:09.512521 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:09 crc kubenswrapper[4728]: I1216 15:01:09.512905 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:09 crc kubenswrapper[4728]: I1216 15:01:09.513473 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:09 crc kubenswrapper[4728]: I1216 15:01:09.514181 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:09 crc kubenswrapper[4728]: I1216 15:01:09.514710 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.506369 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.507733 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.508305 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.509472 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.510049 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.510680 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.511222 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.530161 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.530217 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:11 crc kubenswrapper[4728]: E1216 15:01:11.539948 4728 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:11 crc kubenswrapper[4728]: I1216 15:01:11.540742 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:12 crc kubenswrapper[4728]: I1216 15:01:12.132325 4728 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 16 15:01:12 crc kubenswrapper[4728]: I1216 15:01:12.132842 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 16 15:01:12 crc kubenswrapper[4728]: I1216 15:01:12.248401 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"098ce1cb93b9b4c62c4a8eaa0f6510eff1aa569d48e8de9d4832fcc66e855077"} Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.005366 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" containerName="oauth-openshift" containerID="cri-o://21dcef68631d91aba594e87a446fa92562917775907ea91526f24b0d87c170ec" gracePeriod=15 Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.524513 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.525156 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.525773 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.529729 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.536608 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.538286 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.538576 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.570225 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.570842 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.571294 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.571614 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.571887 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.572149 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:13 crc kubenswrapper[4728]: I1216 15:01:13.572442 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.262277 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.262798 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d" exitCode=1 Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.262918 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d"} Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.263925 4728 scope.go:117] "RemoveContainer" containerID="2e1c1c748da91eb90843845417878cbc56a93dbcee08ecf45f03f90eebf9835d" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.264211 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.266484 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.266667 4728 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3f5cf2ac5d140139945913cbb3074ace38d1158250df20cad4c541ede21e98d4" exitCode=0 Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.266845 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3f5cf2ac5d140139945913cbb3074ace38d1158250df20cad4c541ede21e98d4"} Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.267146 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.267188 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.267190 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: E1216 15:01:14.267702 4728 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.267708 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection 
refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.268174 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.268557 4728 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.269006 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.269458 4728 generic.go:334] "Generic (PLEG): container finished" podID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" containerID="21dcef68631d91aba594e87a446fa92562917775907ea91526f24b0d87c170ec" exitCode=0 Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.269553 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" event={"ID":"eeff725e-9dab-4bec-99f6-8105af9b3b6c","Type":"ContainerDied","Data":"21dcef68631d91aba594e87a446fa92562917775907ea91526f24b0d87c170ec"} Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.269789 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.270273 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.270653 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.271092 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.271651 4728 status_manager.go:851] "Failed to get status for pod" 
podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.271989 4728 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.272479 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.315338 4728 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h6f6v container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.315493 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.671731 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.672214 4728 status_manager.go:851] "Failed to get status for pod" podUID="c35eada5-7775-4d8c-92e3-c744b7f223a1" pod="openshift-marketplace/redhat-marketplace-5t4gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-5t4gg\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.672389 4728 status_manager.go:851] "Failed to get status for pod" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6f6v\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.672648 4728 status_manager.go:851] "Failed to get status for pod" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" pod="openshift-marketplace/certified-operators-pln52" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-pln52\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.673043 4728 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.673227 4728 status_manager.go:851] "Failed to get status for pod" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" pod="openshift-marketplace/redhat-operators-jh2cv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jh2cv\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.673397 4728 status_manager.go:851] "Failed to get status for pod" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.673672 4728 status_manager.go:851] "Failed to get status for pod" podUID="001d33fe-6bb7-4554-919c-e990321a2590" pod="openshift-marketplace/community-operators-7wwrk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-7wwrk\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.673900 4728 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.753952 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-policies\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.753994 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-serving-cert\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754018 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-provider-selection\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754039 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-dir\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754059 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-trusted-ca-bundle\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754078 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-ocp-branding-template\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754102 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-session\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754156 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754751 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754762 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.754978 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-login\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755018 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-cliconfig\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755039 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-error\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755073 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-service-ca\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755098 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzlzn\" (UniqueName: \"kubernetes.io/projected/eeff725e-9dab-4bec-99f6-8105af9b3b6c-kube-api-access-jzlzn\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755117 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-router-certs\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-idp-0-file-data\") pod \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\" (UID: \"eeff725e-9dab-4bec-99f6-8105af9b3b6c\") " Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755295 4728 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 
15:01:14.755309 4728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eeff725e-9dab-4bec-99f6-8105af9b3b6c-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.755320 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.756585 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.756671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.759927 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.760247 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.760360 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.760524 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.760720 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.760754 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.761184 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.762870 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeff725e-9dab-4bec-99f6-8105af9b3b6c-kube-api-access-jzlzn" (OuterVolumeSpecName: "kube-api-access-jzlzn") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "kube-api-access-jzlzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.762958 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "eeff725e-9dab-4bec-99f6-8105af9b3b6c" (UID: "eeff725e-9dab-4bec-99f6-8105af9b3b6c"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.855921 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856090 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856152 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856209 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzlzn\" (UniqueName: \"kubernetes.io/projected/eeff725e-9dab-4bec-99f6-8105af9b3b6c-kube-api-access-jzlzn\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856264 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856344 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856479 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856552 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856612 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856666 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:14 crc kubenswrapper[4728]: I1216 15:01:14.856722 4728 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eeff725e-9dab-4bec-99f6-8105af9b3b6c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.287804 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.288195 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa2f07e1fcd4d1b2f4c20a6c11decf4f7a49ac2e2ebbc73f4ec8ea936f764b96"} Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.291861 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"edec67ae2f6991c11372a1c8745c617fe0c8cad5abdc9ee059b978bd1a946e7b"} Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.291911 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d2c3bb5e79d3a0c240b2a21f32879365516cde4d90791b40cba511f1c824af9"} Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.291926 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"439bf760d04fdb00f71dbe88cf1c7798616bfa9e5cba6e7dc413a0b0ea50db3b"} Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.293856 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" event={"ID":"eeff725e-9dab-4bec-99f6-8105af9b3b6c","Type":"ContainerDied","Data":"7a130af95c875f838ba20efd5b85bc31edf44d9f3c3be4e2ad2cd49be0bf6996"} Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.293895 4728 scope.go:117] "RemoveContainer" containerID="21dcef68631d91aba594e87a446fa92562917775907ea91526f24b0d87c170ec" Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.294033 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6f6v" Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.346222 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.534188 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 15:01:15 crc kubenswrapper[4728]: I1216 15:01:15.930273 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5t4gg" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.302429 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b284c9865c01007d526e687898697883d3dcbed35c18f298de7bce7b10729ed7"} Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.302480 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5e93b829a3c811465e7c701e7d28fe03328546e14577bde2068043d4b350419d"} Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.302612 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.302702 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.302729 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.303638 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.541793 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.541866 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.550748 4728 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]log ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]etcd ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 16 15:01:16 crc kubenswrapper[4728]: 
[+]poststarthook/generic-apiserver-start-informers ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/priority-and-fairness-filter ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-apiextensions-informers ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-apiextensions-controllers ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/crd-informer-synced ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-system-namespaces-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 16 15:01:16 crc kubenswrapper[4728]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 16 15:01:16 crc kubenswrapper[4728]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/bootstrap-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/start-kube-aggregator-informers ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-registration-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-discovery-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]autoregister-completion ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-openapi-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 16 15:01:16 crc kubenswrapper[4728]: livez check failed Dec 16 15:01:16 crc kubenswrapper[4728]: I1216 15:01:16.550866 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 15:01:21 crc kubenswrapper[4728]: I1216 15:01:21.312862 4728 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:21 crc kubenswrapper[4728]: I1216 15:01:21.460796 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="92e87824-fab9-4cdc-b3e6-77395da18e02" Dec 16 15:01:22 crc kubenswrapper[4728]: 
I1216 15:01:22.338668 4728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:22 crc kubenswrapper[4728]: I1216 15:01:22.338994 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="008cd71a-a642-43c5-8aa2-98db283d9c45" Dec 16 15:01:22 crc kubenswrapper[4728]: I1216 15:01:22.341458 4728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="92e87824-fab9-4cdc-b3e6-77395da18e02" Dec 16 15:01:28 crc kubenswrapper[4728]: I1216 15:01:28.109034 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 16 15:01:30 crc kubenswrapper[4728]: I1216 15:01:30.868861 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 16 15:01:31 crc kubenswrapper[4728]: I1216 15:01:31.264817 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 16 15:01:31 crc kubenswrapper[4728]: I1216 15:01:31.502023 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 16 15:01:31 crc kubenswrapper[4728]: I1216 15:01:31.908762 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 16 15:01:32 crc kubenswrapper[4728]: I1216 15:01:32.121907 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 16 15:01:32 crc kubenswrapper[4728]: I1216 15:01:32.174674 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 16 15:01:32 crc kubenswrapper[4728]: I1216 15:01:32.415309 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 16 15:01:32 crc kubenswrapper[4728]: I1216 15:01:32.719572 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 16 15:01:33 crc kubenswrapper[4728]: I1216 15:01:33.008776 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 16 15:01:33 crc kubenswrapper[4728]: I1216 15:01:33.039106 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 16 15:01:33 crc kubenswrapper[4728]: I1216 15:01:33.046745 4728 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 16 15:01:33 crc kubenswrapper[4728]: I1216 15:01:33.054831 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 16 15:01:33 crc kubenswrapper[4728]: I1216 15:01:33.123265 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 16 15:01:33 crc kubenswrapper[4728]: I1216 15:01:33.850993 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.003428 4728 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.016938 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.049225 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.082094 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.289785 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.328848 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.608722 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.672832 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.692338 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.769463 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.770632 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.770894 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.821952 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.897017 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.919904 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 16 15:01:34 crc kubenswrapper[4728]: I1216 15:01:34.977544 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.016980 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.022869 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.050505 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.177515 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.220547 4728 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.221080 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pln52" podStartSLOduration=35.671239447 podStartE2EDuration="40.221060868s" podCreationTimestamp="2025-12-16 15:00:55 +0000 UTC" firstStartedPulling="2025-12-16 15:00:57.123758234 +0000 UTC m=+237.963937218" lastFinishedPulling="2025-12-16 15:01:01.673579655 +0000 UTC m=+242.513758639" observedRunningTime="2025-12-16 15:01:21.255100991 +0000 UTC m=+262.095279995" watchObservedRunningTime="2025-12-16 15:01:35.221060868 +0000 UTC m=+276.061239872" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.221239 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jh2cv" podStartSLOduration=38.779960578 podStartE2EDuration="42.221231803s" podCreationTimestamp="2025-12-16 15:00:53 +0000 UTC" firstStartedPulling="2025-12-16 15:00:55.095336377 +0000 UTC m=+235.935515401" lastFinishedPulling="2025-12-16 15:00:58.536607642 +0000 UTC m=+239.376786626" observedRunningTime="2025-12-16 15:01:21.294695013 +0000 UTC m=+262.134874057" watchObservedRunningTime="2025-12-16 15:01:35.221231803 +0000 UTC m=+276.061410797" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.222855 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5t4gg" podStartSLOduration=35.434574078 podStartE2EDuration="40.222843006s" podCreationTimestamp="2025-12-16 15:00:55 +0000 UTC" firstStartedPulling="2025-12-16 15:00:58.12295118 +0000 UTC m=+238.963130164" lastFinishedPulling="2025-12-16 15:01:02.911220108 +0000 UTC m=+243.751399092" observedRunningTime="2025-12-16 15:01:21.415175036 +0000 UTC m=+262.255354020" watchObservedRunningTime="2025-12-16 15:01:35.222843006 +0000 UTC m=+276.063022040" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.223539 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.224487 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.224474331 podStartE2EDuration="37.224474331s" podCreationTimestamp="2025-12-16 15:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:01:21.273783047 +0000 UTC m=+262.113962071" watchObservedRunningTime="2025-12-16 15:01:35.224474331 +0000 UTC m=+276.064653335" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.226360 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6f6v","openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.226444 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.231571 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 
15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.237583 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.261349 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.267443 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.267399573 podStartE2EDuration="14.267399573s" podCreationTimestamp="2025-12-16 15:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:01:35.245590902 +0000 UTC m=+276.085769896" watchObservedRunningTime="2025-12-16 15:01:35.267399573 +0000 UTC m=+276.107578567" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.268600 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.328376 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.407373 4728 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.414042 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.460447 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.512710 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" path="/var/lib/kubelet/pods/eeff725e-9dab-4bec-99f6-8105af9b3b6c/volumes" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.555858 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.559949 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.567839 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.719542 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.749058 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.785596 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.789637 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 16 15:01:35 crc kubenswrapper[4728]: I1216 15:01:35.996002 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.004871 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.005993 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.048784 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.218669 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.244319 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.297756 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.322185 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.380961 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.414702 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.433772 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.543281 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.544366 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.547901 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.553921 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.607500 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.675163 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.684697 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.763042 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.822970 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.886953 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.940728 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 16 15:01:36 crc kubenswrapper[4728]: I1216 15:01:36.993130 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.108024 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.117293 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.158951 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.194678 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.304800 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.608327 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.685358 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.709936 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.782573 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.789957 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.919973 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 16 15:01:37 crc kubenswrapper[4728]: I1216 15:01:37.944164 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.046819 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.052000 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.071105 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.143782 4728 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.219905 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.279521 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.427773 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.428264 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.523675 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.525119 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.590671 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.612256 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.627540 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.653993 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.669963 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.691006 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.761641 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.790141 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.845257 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.885932 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.986586 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 16 15:01:38 crc kubenswrapper[4728]: I1216 15:01:38.996270 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.102631 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 16 
15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.211010 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.270163 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.489927 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.540501 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.568903 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.576963 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.678426 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.679567 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.682891 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.691754 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.726799 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.741393 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.760285 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.894240 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 16 15:01:39 crc kubenswrapper[4728]: I1216 15:01:39.944353 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.001924 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.045397 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.083452 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.145848 4728 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.243216 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.318024 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.409737 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.416457 4728 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.488135 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.560826 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.567477 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.620885 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.668318 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.768027 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.824173 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.849650 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.892374 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.914712 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.922807 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.925628 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.944659 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.949881 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 16 15:01:40 crc kubenswrapper[4728]: I1216 15:01:40.979821 4728 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.134142 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.145923 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.288986 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.361043 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.363497 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.392086 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.437120 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.448072 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.498193 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.580203 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.591326 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.607035 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.617604 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.617920 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.678423 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.721976 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.771340 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.826394 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 16 15:01:41 crc kubenswrapper[4728]: 
I1216 15:01:41.889678 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 16 15:01:41 crc kubenswrapper[4728]: I1216 15:01:41.911095 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.022843 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.032159 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.068727 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.178234 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.221834 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.272420 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.355305 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.460201 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.625096 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.670057 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.730150 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.807033 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.816385 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.880921 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 16 15:01:42 crc kubenswrapper[4728]: I1216 15:01:42.889678 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.044155 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.155547 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 
15:01:43.218093 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.356913 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.383571 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.543287 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.622790 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.753809 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.775384 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.807785 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.879012 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.914265 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.942081 4728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.942779 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://233967acac0c91bddd4b6ae2b5089ca6520a2f51f40521cdaed11d977eefdd42" gracePeriod=5 Dec 16 15:01:43 crc kubenswrapper[4728]: I1216 15:01:43.971832 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.093955 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.098754 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.248920 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.286153 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.287368 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 16 15:01:44 crc 
kubenswrapper[4728]: I1216 15:01:44.298178 4728 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.397148 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.407135 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.436998 4728 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.443737 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.505075 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.606189 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.613961 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.661288 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.727508 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.878801 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980512 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-59c6b49d48-mbth5"] Dec 16 15:01:44 crc kubenswrapper[4728]: E1216 15:01:44.980745 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" containerName="oauth-openshift" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980759 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" containerName="oauth-openshift" Dec 16 15:01:44 crc kubenswrapper[4728]: E1216 15:01:44.980777 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980785 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 15:01:44 crc kubenswrapper[4728]: E1216 15:01:44.980801 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" containerName="installer" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980810 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" containerName="installer" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980913 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980927 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="42661dad-bece-4f43-9621-9c04d54ecb5c" containerName="installer" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.980942 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeff725e-9dab-4bec-99f6-8105af9b3b6c" containerName="oauth-openshift" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.981330 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.985986 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.986041 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.986144 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.986400 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.987757 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.987921 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.988044 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.992549 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.992859 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.993103 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.997159 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 16 15:01:44 crc kubenswrapper[4728]: I1216 15:01:44.997467 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.001273 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-59c6b49d48-mbth5"] Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.016520 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.017175 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 
15:01:45.024355 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.045813 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.106751 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164256 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164339 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-audit-policies\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164392 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164498 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164543 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164590 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-router-certs\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164627 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-service-ca\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-login\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164712 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164743 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-error\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164773 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcq85\" (UniqueName: \"kubernetes.io/projected/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-kube-api-access-xcq85\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.164949 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-audit-dir\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.165105 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-session\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.165158 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.184951 4728 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266258 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266349 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-audit-policies\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266478 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266512 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266554 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-router-certs\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266590 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-service-ca\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266638 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-login\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: 
\"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266667 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266700 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-error\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266733 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcq85\" (UniqueName: \"kubernetes.io/projected/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-kube-api-access-xcq85\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266778 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-audit-dir\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266835 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-session\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.266870 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.267206 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-audit-policies\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.267712 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " 
pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.267802 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-audit-dir\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.271785 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-error\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.272529 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-service-ca\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.272561 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-login\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.273343 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.273600 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.273816 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.274430 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 
15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.274602 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-session\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.275483 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-system-router-certs\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.277681 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.286759 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcq85\" (UniqueName: \"kubernetes.io/projected/0687e8eb-29f2-4e81-bd68-f5c466c75ea1-kube-api-access-xcq85\") pod \"oauth-openshift-59c6b49d48-mbth5\" (UID: \"0687e8eb-29f2-4e81-bd68-f5c466c75ea1\") " pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.305698 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.311233 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.344741 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.379026 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.580874 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.722970 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-59c6b49d48-mbth5"] Dec 16 15:01:45 crc kubenswrapper[4728]: I1216 15:01:45.873184 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.127324 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.149345 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.321480 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.422436 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.456937 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.489661 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" event={"ID":"0687e8eb-29f2-4e81-bd68-f5c466c75ea1","Type":"ContainerStarted","Data":"a5261e6379b7f18d6b7d13a2d3c5c13b0d186bc4482f089b245ddbe3a0055c79"} Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.489709 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" event={"ID":"0687e8eb-29f2-4e81-bd68-f5c466c75ea1","Type":"ContainerStarted","Data":"33f89a7c91c445101d7a6a3f1835be4a2db6b40f6765283f6213ae51206db937"} Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.489956 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.518785 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" podStartSLOduration=58.518763757 podStartE2EDuration="58.518763757s" podCreationTimestamp="2025-12-16 15:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:01:46.515722223 +0000 UTC m=+287.355901207" 
watchObservedRunningTime="2025-12-16 15:01:46.518763757 +0000 UTC m=+287.358942751" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.554192 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.589131 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-59c6b49d48-mbth5" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.600723 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.798243 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 16 15:01:46 crc kubenswrapper[4728]: I1216 15:01:46.944814 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 16 15:01:47 crc kubenswrapper[4728]: I1216 15:01:47.101791 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 16 15:01:47 crc kubenswrapper[4728]: I1216 15:01:47.240828 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.513029 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.514865 4728 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="233967acac0c91bddd4b6ae2b5089ca6520a2f51f40521cdaed11d977eefdd42" exitCode=137 Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.515107 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f0c5500bcc8487cfd2a61a3b2ea727ffaf2ea69ad40906f5c9f62b60667908" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.537369 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.537800 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728343 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728575 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728637 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728692 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728798 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728844 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728895 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.728955 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.729142 4728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.729158 4728 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.729170 4728 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.730167 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.741334 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.831576 4728 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:49 crc kubenswrapper[4728]: I1216 15:01:49.831646 4728 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 16 15:01:50 crc kubenswrapper[4728]: I1216 15:01:50.522575 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 16 15:01:51 crc kubenswrapper[4728]: I1216 15:01:51.520050 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 16 15:01:51 crc kubenswrapper[4728]: I1216 15:01:51.520576 4728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 16 15:01:51 crc kubenswrapper[4728]: I1216 15:01:51.538666 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:01:51 crc kubenswrapper[4728]: I1216 15:01:51.538734 4728 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2b0e07ed-fb59-46fa-9ff2-04bf36c5f1ec" Dec 16 15:01:51 crc kubenswrapper[4728]: I1216 15:01:51.545177 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 16 15:01:51 crc kubenswrapper[4728]: I1216 15:01:51.545239 4728 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2b0e07ed-fb59-46fa-9ff2-04bf36c5f1ec" Dec 16 15:01:55 crc kubenswrapper[4728]: I1216 15:01:55.913819 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 16 15:01:59 crc kubenswrapper[4728]: I1216 15:01:59.330905 4728 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 16 15:02:02 crc kubenswrapper[4728]: I1216 15:02:02.472450 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 16 15:02:11 crc kubenswrapper[4728]: I1216 15:02:11.044724 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 16 15:02:14 crc kubenswrapper[4728]: I1216 15:02:14.579674 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 16 15:02:14 crc kubenswrapper[4728]: I1216 15:02:14.963141 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 16 15:02:15 crc kubenswrapper[4728]: I1216 15:02:15.759316 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 16 15:02:19 crc kubenswrapper[4728]: I1216 15:02:19.471255 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 16 15:02:19 crc kubenswrapper[4728]: I1216 15:02:19.935011 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 16 15:02:22 crc kubenswrapper[4728]: I1216 15:02:22.795005 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 16 15:02:22 crc kubenswrapper[4728]: I1216 15:02:22.899784 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sksfs"] Dec 16 15:02:22 crc kubenswrapper[4728]: I1216 15:02:22.900485 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:22 crc kubenswrapper[4728]: I1216 15:02:22.914345 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sksfs"] Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.097615 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfb6f123-ab50-4490-af74-4c5926d0417d-registry-certificates\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.097676 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfb6f123-ab50-4490-af74-4c5926d0417d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.097702 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5xm\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-kube-api-access-wp5xm\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.097751 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfb6f123-ab50-4490-af74-4c5926d0417d-trusted-ca\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.097848 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.098004 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfb6f123-ab50-4490-af74-4c5926d0417d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.098076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-bound-sa-token\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.098193 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-registry-tls\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.123912 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.200901 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfb6f123-ab50-4490-af74-4c5926d0417d-trusted-ca\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.200994 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfb6f123-ab50-4490-af74-4c5926d0417d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.201146 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-bound-sa-token\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.201261 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-registry-tls\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.201329 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfb6f123-ab50-4490-af74-4c5926d0417d-registry-certificates\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.201369 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfb6f123-ab50-4490-af74-4c5926d0417d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.201432 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5xm\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-kube-api-access-wp5xm\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.202386 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfb6f123-ab50-4490-af74-4c5926d0417d-trusted-ca\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.203960 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfb6f123-ab50-4490-af74-4c5926d0417d-registry-certificates\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.205328 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfb6f123-ab50-4490-af74-4c5926d0417d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.212037 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-registry-tls\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.212510 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfb6f123-ab50-4490-af74-4c5926d0417d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.220885 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5xm\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-kube-api-access-wp5xm\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.224031 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfb6f123-ab50-4490-af74-4c5926d0417d-bound-sa-token\") pod \"image-registry-66df7c8f76-sksfs\" (UID: \"cfb6f123-ab50-4490-af74-4c5926d0417d\") " pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.231649 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.710359 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sksfs"] Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.763385 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" event={"ID":"cfb6f123-ab50-4490-af74-4c5926d0417d","Type":"ContainerStarted","Data":"3c84d04f105be83ab8b72eb83e17fb5f4aa3e77aaa59a6899e2115250d32f1eb"} Dec 16 15:02:23 crc kubenswrapper[4728]: I1216 15:02:23.886909 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 16 15:02:24 crc kubenswrapper[4728]: I1216 15:02:24.771977 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" event={"ID":"cfb6f123-ab50-4490-af74-4c5926d0417d","Type":"ContainerStarted","Data":"0a7e10841746b629d61301b030ff1ed4c71707c89204e03e904083e938b98735"} Dec 16 15:02:24 crc kubenswrapper[4728]: I1216 15:02:24.773154 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:24 crc kubenswrapper[4728]: I1216 15:02:24.803703 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" podStartSLOduration=2.803672984 podStartE2EDuration="2.803672984s" podCreationTimestamp="2025-12-16 15:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:02:24.803240892 +0000 UTC m=+325.643419916" watchObservedRunningTime="2025-12-16 15:02:24.803672984 +0000 UTC m=+325.643852008" Dec 16 15:02:43 crc kubenswrapper[4728]: I1216 15:02:43.238030 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-sksfs" Dec 16 15:02:43 crc kubenswrapper[4728]: I1216 15:02:43.304616 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m9gpr"] Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.358154 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" podUID="269fe7e0-633b-41d4-8a8f-cd39424229e4" containerName="registry" containerID="cri-o://66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461" gracePeriod=30 Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.802753 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812236 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-certificates\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812287 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-tls\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812315 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/269fe7e0-633b-41d4-8a8f-cd39424229e4-ca-trust-extracted\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812336 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/269fe7e0-633b-41d4-8a8f-cd39424229e4-installation-pull-secrets\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812352 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-trusted-ca\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812391 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-bound-sa-token\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812531 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.812555 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwzvp\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-kube-api-access-pwzvp\") pod \"269fe7e0-633b-41d4-8a8f-cd39424229e4\" (UID: \"269fe7e0-633b-41d4-8a8f-cd39424229e4\") " Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.814637 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.814664 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.818326 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.818390 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.823486 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269fe7e0-633b-41d4-8a8f-cd39424229e4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.831765 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.832697 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.840463 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-kube-api-access-pwzvp" (OuterVolumeSpecName: "kube-api-access-pwzvp") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "kube-api-access-pwzvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.840879 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.851902 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/269fe7e0-633b-41d4-8a8f-cd39424229e4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "269fe7e0-633b-41d4-8a8f-cd39424229e4" (UID: "269fe7e0-633b-41d4-8a8f-cd39424229e4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914083 4728 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914122 4728 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914137 4728 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/269fe7e0-633b-41d4-8a8f-cd39424229e4-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914149 4728 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/269fe7e0-633b-41d4-8a8f-cd39424229e4-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914163 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/269fe7e0-633b-41d4-8a8f-cd39424229e4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914199 4728 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:08 crc kubenswrapper[4728]: I1216 15:03:08.914214 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwzvp\" (UniqueName: \"kubernetes.io/projected/269fe7e0-633b-41d4-8a8f-cd39424229e4-kube-api-access-pwzvp\") on node \"crc\" DevicePath \"\"" Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.038873 4728 generic.go:334] "Generic (PLEG): container finished" podID="269fe7e0-633b-41d4-8a8f-cd39424229e4" containerID="66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461" exitCode=0 Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.038912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" event={"ID":"269fe7e0-633b-41d4-8a8f-cd39424229e4","Type":"ContainerDied","Data":"66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461"} Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.038939 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" event={"ID":"269fe7e0-633b-41d4-8a8f-cd39424229e4","Type":"ContainerDied","Data":"9c3644477a754c1cbe91fe1f55bc8a964f8775668826a8878f3423b7dbc6d001"} Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.038954 4728 scope.go:117] "RemoveContainer" containerID="66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461" Dec 16 15:03:09 crc 
kubenswrapper[4728]: I1216 15:03:09.038998 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m9gpr" Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.058362 4728 scope.go:117] "RemoveContainer" containerID="66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461" Dec 16 15:03:09 crc kubenswrapper[4728]: E1216 15:03:09.058942 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461\": container with ID starting with 66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461 not found: ID does not exist" containerID="66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461" Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.059006 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461"} err="failed to get container status \"66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461\": rpc error: code = NotFound desc = could not find container \"66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461\": container with ID starting with 66cd52f989e06ca22ef56c515f1bfecaa9076e4930506bf443be92c6b920d461 not found: ID does not exist" Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.084481 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m9gpr"] Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.087869 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m9gpr"] Dec 16 15:03:09 crc kubenswrapper[4728]: I1216 15:03:09.518535 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269fe7e0-633b-41d4-8a8f-cd39424229e4" path="/var/lib/kubelet/pods/269fe7e0-633b-41d4-8a8f-cd39424229e4/volumes" Dec 16 15:03:38 crc kubenswrapper[4728]: I1216 15:03:38.818813 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:03:38 crc kubenswrapper[4728]: I1216 15:03:38.821711 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:04:08 crc kubenswrapper[4728]: I1216 15:04:08.819344 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:04:08 crc kubenswrapper[4728]: I1216 15:04:08.820159 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:04:08 crc 
kubenswrapper[4728]: I1216 15:04:08.820244 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:04:08 crc kubenswrapper[4728]: I1216 15:04:08.821450 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bda00ce73e1c1ab471f206d48aed0e38d16bcd1f6b879870ad51db12f879d97"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:04:08 crc kubenswrapper[4728]: I1216 15:04:08.821596 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://4bda00ce73e1c1ab471f206d48aed0e38d16bcd1f6b879870ad51db12f879d97" gracePeriod=600 Dec 16 15:04:09 crc kubenswrapper[4728]: I1216 15:04:09.453201 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="4bda00ce73e1c1ab471f206d48aed0e38d16bcd1f6b879870ad51db12f879d97" exitCode=0 Dec 16 15:04:09 crc kubenswrapper[4728]: I1216 15:04:09.453277 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"4bda00ce73e1c1ab471f206d48aed0e38d16bcd1f6b879870ad51db12f879d97"} Dec 16 15:04:09 crc kubenswrapper[4728]: I1216 15:04:09.453925 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"5556e0d6dfe6e1666b1eb820e6992928174cc0e89be80318dfc33d104f059a37"} Dec 16 15:04:09 crc kubenswrapper[4728]: I1216 15:04:09.453956 4728 scope.go:117] "RemoveContainer" containerID="1ab482c1b92b60bfe8178749cc559a76183cd89e226c55de3736a9fe55f745ea" Dec 16 15:06:01 crc kubenswrapper[4728]: I1216 15:06:01.811692 4728 scope.go:117] "RemoveContainer" containerID="2d7de4e3b602ffeb1d763eee72dca98a9f2ba04f61fb4709bae6ae8da3450b87" Dec 16 15:06:38 crc kubenswrapper[4728]: I1216 15:06:38.821517 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:06:38 crc kubenswrapper[4728]: I1216 15:06:38.824591 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.209358 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qtpbj"] Dec 16 15:06:48 crc kubenswrapper[4728]: E1216 15:06:48.210191 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269fe7e0-633b-41d4-8a8f-cd39424229e4" containerName="registry" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.210207 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="269fe7e0-633b-41d4-8a8f-cd39424229e4" containerName="registry" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.210619 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="269fe7e0-633b-41d4-8a8f-cd39424229e4" containerName="registry" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.211260 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.212885 4728 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hzfx5" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.213504 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.214786 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.221467 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qtpbj"] Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.224719 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9x457"] Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.225572 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.226914 4728 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7kf27" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.262953 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9x457"] Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.268286 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xlhf9"] Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.271371 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xlhf9" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.274867 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xlhf9"] Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.274996 4728 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-h4js6" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.311883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngftf\" (UniqueName: \"kubernetes.io/projected/98db182b-146e-48eb-918d-ff62909f62de-kube-api-access-ngftf\") pod \"cert-manager-5b446d88c5-xlhf9\" (UID: \"98db182b-146e-48eb-918d-ff62909f62de\") " pod="cert-manager/cert-manager-5b446d88c5-xlhf9" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.311935 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5bh\" (UniqueName: \"kubernetes.io/projected/14b59c49-2ca7-4fd1-96a7-926474663fc8-kube-api-access-fr5bh\") pod \"cert-manager-webhook-5655c58dd6-9x457\" (UID: \"14b59c49-2ca7-4fd1-96a7-926474663fc8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.311966 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-667c4\" (UniqueName: \"kubernetes.io/projected/86fdf4d9-bff1-40f5-b1f7-7d74536c7f39-kube-api-access-667c4\") pod \"cert-manager-cainjector-7f985d654d-qtpbj\" (UID: \"86fdf4d9-bff1-40f5-b1f7-7d74536c7f39\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.414234 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngftf\" (UniqueName: \"kubernetes.io/projected/98db182b-146e-48eb-918d-ff62909f62de-kube-api-access-ngftf\") pod \"cert-manager-5b446d88c5-xlhf9\" (UID: \"98db182b-146e-48eb-918d-ff62909f62de\") " pod="cert-manager/cert-manager-5b446d88c5-xlhf9" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.414326 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5bh\" (UniqueName: \"kubernetes.io/projected/14b59c49-2ca7-4fd1-96a7-926474663fc8-kube-api-access-fr5bh\") pod \"cert-manager-webhook-5655c58dd6-9x457\" (UID: \"14b59c49-2ca7-4fd1-96a7-926474663fc8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.414366 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-667c4\" (UniqueName: \"kubernetes.io/projected/86fdf4d9-bff1-40f5-b1f7-7d74536c7f39-kube-api-access-667c4\") pod \"cert-manager-cainjector-7f985d654d-qtpbj\" (UID: \"86fdf4d9-bff1-40f5-b1f7-7d74536c7f39\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.435848 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngftf\" (UniqueName: \"kubernetes.io/projected/98db182b-146e-48eb-918d-ff62909f62de-kube-api-access-ngftf\") pod \"cert-manager-5b446d88c5-xlhf9\" (UID: \"98db182b-146e-48eb-918d-ff62909f62de\") " pod="cert-manager/cert-manager-5b446d88c5-xlhf9" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.438850 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fr5bh\" (UniqueName: \"kubernetes.io/projected/14b59c49-2ca7-4fd1-96a7-926474663fc8-kube-api-access-fr5bh\") pod \"cert-manager-webhook-5655c58dd6-9x457\" (UID: \"14b59c49-2ca7-4fd1-96a7-926474663fc8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.440644 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-667c4\" (UniqueName: \"kubernetes.io/projected/86fdf4d9-bff1-40f5-b1f7-7d74536c7f39-kube-api-access-667c4\") pod \"cert-manager-cainjector-7f985d654d-qtpbj\" (UID: \"86fdf4d9-bff1-40f5-b1f7-7d74536c7f39\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.531271 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.556243 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.587771 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xlhf9" Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.745664 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qtpbj"] Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.754934 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:06:48 crc kubenswrapper[4728]: I1216 15:06:48.848735 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xlhf9"] Dec 16 15:06:49 crc kubenswrapper[4728]: I1216 15:06:49.013142 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9x457"] Dec 16 15:06:49 crc kubenswrapper[4728]: W1216 15:06:49.019460 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b59c49_2ca7_4fd1_96a7_926474663fc8.slice/crio-9d4e7182239c17a10b333e9b7169fd8b0aad77a08a6c10c7b2ff14c2bb250538 WatchSource:0}: Error finding container 9d4e7182239c17a10b333e9b7169fd8b0aad77a08a6c10c7b2ff14c2bb250538: Status 404 returned error can't find the container with id 9d4e7182239c17a10b333e9b7169fd8b0aad77a08a6c10c7b2ff14c2bb250538 Dec 16 15:06:49 crc kubenswrapper[4728]: I1216 15:06:49.528922 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" event={"ID":"14b59c49-2ca7-4fd1-96a7-926474663fc8","Type":"ContainerStarted","Data":"9d4e7182239c17a10b333e9b7169fd8b0aad77a08a6c10c7b2ff14c2bb250538"} Dec 16 15:06:49 crc kubenswrapper[4728]: I1216 15:06:49.530976 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" event={"ID":"86fdf4d9-bff1-40f5-b1f7-7d74536c7f39","Type":"ContainerStarted","Data":"20aba5b477508b8ea94771959fa27f2c656a5cdd33f13bb35d38e06bfa191305"} Dec 16 15:06:49 crc kubenswrapper[4728]: I1216 15:06:49.534631 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xlhf9" event={"ID":"98db182b-146e-48eb-918d-ff62909f62de","Type":"ContainerStarted","Data":"a7a6006be354ffaadf957e08879416199123f9bd0a51d25bf80bff19a1184bf9"} Dec 16 15:06:52 crc kubenswrapper[4728]: I1216 15:06:52.557710 
4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xlhf9" event={"ID":"98db182b-146e-48eb-918d-ff62909f62de","Type":"ContainerStarted","Data":"b8d9c2fd1ad50be9e64dd20f360efd53c4eedd655c782bb22fbdd950671c85f2"} Dec 16 15:06:52 crc kubenswrapper[4728]: I1216 15:06:52.559669 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" event={"ID":"14b59c49-2ca7-4fd1-96a7-926474663fc8","Type":"ContainerStarted","Data":"626823f1ff874eb87e42a9be02bffd18da0c6274850bd15be0373e2589bee49a"} Dec 16 15:06:52 crc kubenswrapper[4728]: I1216 15:06:52.560204 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:52 crc kubenswrapper[4728]: I1216 15:06:52.574102 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-xlhf9" podStartSLOduration=2.069885818 podStartE2EDuration="4.574079995s" podCreationTimestamp="2025-12-16 15:06:48 +0000 UTC" firstStartedPulling="2025-12-16 15:06:48.856699703 +0000 UTC m=+589.696878687" lastFinishedPulling="2025-12-16 15:06:51.36089384 +0000 UTC m=+592.201072864" observedRunningTime="2025-12-16 15:06:52.571483543 +0000 UTC m=+593.411662527" watchObservedRunningTime="2025-12-16 15:06:52.574079995 +0000 UTC m=+593.414258979" Dec 16 15:06:52 crc kubenswrapper[4728]: I1216 15:06:52.594612 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" podStartSLOduration=2.193785031 podStartE2EDuration="4.594585773s" podCreationTimestamp="2025-12-16 15:06:48 +0000 UTC" firstStartedPulling="2025-12-16 15:06:49.022031495 +0000 UTC m=+589.862210469" lastFinishedPulling="2025-12-16 15:06:51.422832187 +0000 UTC m=+592.263011211" observedRunningTime="2025-12-16 15:06:52.593865523 +0000 UTC m=+593.434044537" watchObservedRunningTime="2025-12-16 15:06:52.594585773 +0000 UTC m=+593.434764757" Dec 16 15:06:53 crc kubenswrapper[4728]: I1216 15:06:53.568510 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" event={"ID":"86fdf4d9-bff1-40f5-b1f7-7d74536c7f39","Type":"ContainerStarted","Data":"ce795823ce82f271ee851a92001b4a2acaef64488e24094c4cd4582a284e4246"} Dec 16 15:06:53 crc kubenswrapper[4728]: I1216 15:06:53.591851 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qtpbj" podStartSLOduration=1.613796507 podStartE2EDuration="5.591823673s" podCreationTimestamp="2025-12-16 15:06:48 +0000 UTC" firstStartedPulling="2025-12-16 15:06:48.754614174 +0000 UTC m=+589.594793158" lastFinishedPulling="2025-12-16 15:06:52.73264134 +0000 UTC m=+593.572820324" observedRunningTime="2025-12-16 15:06:53.586142235 +0000 UTC m=+594.426321279" watchObservedRunningTime="2025-12-16 15:06:53.591823673 +0000 UTC m=+594.432002697" Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.559597 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-9x457" Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.775704 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2458v"] Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777137 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" 
podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-controller" containerID="cri-o://7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777199 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="nbdb" containerID="cri-o://fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777244 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777333 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="sbdb" containerID="cri-o://a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777357 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="northd" containerID="cri-o://99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777392 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-acl-logging" containerID="cri-o://18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.777286 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-node" containerID="cri-o://b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" gracePeriod=30 Dec 16 15:06:58 crc kubenswrapper[4728]: I1216 15:06:58.816722 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" containerID="cri-o://dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" gracePeriod=30 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.149906 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/3.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.153833 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovn-acl-logging/0.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.154687 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovn-controller/0.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.155846 4728 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237105 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zpztt"] Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237341 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237355 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237364 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237373 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237386 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237396 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237611 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kubecfg-setup" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237623 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kubecfg-setup" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237633 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="nbdb" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237641 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="nbdb" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237651 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-acl-logging" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237659 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-acl-logging" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237670 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="sbdb" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237677 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="sbdb" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237692 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="northd" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237699 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="northd" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237710 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" 
containerName="kube-rbac-proxy-ovn-metrics" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237717 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237728 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237738 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237745 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237753 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.237768 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-node" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237776 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-node" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237890 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-acl-logging" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237903 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="sbdb" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237910 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="nbdb" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237921 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237928 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237939 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237948 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237958 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-ovn-metrics" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237968 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="kube-rbac-proxy-node" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237979 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="northd" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.237988 
4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovn-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.238099 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.238107 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.238214 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerName="ovnkube-controller" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.240032 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253060 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-openvswitch\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253098 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-netd\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253125 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-etc-openvswitch\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253149 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-var-lib-openvswitch\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253145 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253167 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-slash\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253200 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-slash" (OuterVolumeSpecName: "host-slash") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253278 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253287 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-log-socket\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253345 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-script-lib\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253385 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-systemd\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253277 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253315 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-log-socket" (OuterVolumeSpecName: "log-socket") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253328 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253487 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-ovn-kubernetes\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253520 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-node-log\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253559 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-kubelet\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253595 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-node-log" (OuterVolumeSpecName: "node-log") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253603 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253619 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253595 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/480f8c1b-60cc-4685-86cc-a457f645e87c-ovn-node-metrics-cert\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253754 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwj7t\" (UniqueName: \"kubernetes.io/projected/480f8c1b-60cc-4685-86cc-a457f645e87c-kube-api-access-jwj7t\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253817 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-env-overrides\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253888 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253940 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-ovn\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.253986 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-systemd-units\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254029 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-bin\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254085 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-netns\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254138 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-config\") pod \"480f8c1b-60cc-4685-86cc-a457f645e87c\" (UID: \"480f8c1b-60cc-4685-86cc-a457f645e87c\") " Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254326 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-script-lib" (OuterVolumeSpecName: 
"ovnkube-script-lib") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254470 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254544 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254539 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254583 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254600 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254714 4728 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254734 4728 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254751 4728 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-slash\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254765 4728 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-log-socket\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254778 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254793 4728 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254807 4728 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-node-log\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254820 4728 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254836 4728 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254845 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254851 4728 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254887 4728 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254903 4728 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254917 4728 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254930 4728 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254941 4728 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.254988 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.260891 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/480f8c1b-60cc-4685-86cc-a457f645e87c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.261356 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480f8c1b-60cc-4685-86cc-a457f645e87c-kube-api-access-jwj7t" (OuterVolumeSpecName: "kube-api-access-jwj7t") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "kube-api-access-jwj7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.284772 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "480f8c1b-60cc-4685-86cc-a457f645e87c" (UID: "480f8c1b-60cc-4685-86cc-a457f645e87c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356170 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-etc-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356239 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356264 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356286 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-cni-bin\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356317 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-systemd-units\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356346 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-cni-netd\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356372 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-node-log\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356398 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-var-lib-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356490 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovnkube-script-lib\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356522 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-systemd\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356549 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovnkube-config\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356750 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-kubelet\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356829 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-log-socket\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356919 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-slash\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.356968 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-run-netns\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-ovn\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357048 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357083 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsvs6\" (UniqueName: \"kubernetes.io/projected/5f256a94-2907-4e9f-a90e-8a610cca5cc7-kube-api-access-wsvs6\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357146 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-env-overrides\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357187 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357327 4728 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/480f8c1b-60cc-4685-86cc-a457f645e87c-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357355 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/480f8c1b-60cc-4685-86cc-a457f645e87c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357376 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwj7t\" (UniqueName: \"kubernetes.io/projected/480f8c1b-60cc-4685-86cc-a457f645e87c-kube-api-access-jwj7t\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357395 4728 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.357446 4728 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/480f8c1b-60cc-4685-86cc-a457f645e87c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.458980 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459108 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-etc-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459156 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: 
\"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459238 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459166 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459346 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459257 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-etc-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459392 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-cni-bin\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459474 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-systemd-units\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459521 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-cni-netd\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459685 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-cni-netd\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459673 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-node-log\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" 
Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459746 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-var-lib-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459811 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovnkube-script-lib\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459852 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-systemd\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459880 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovnkube-config\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459614 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-systemd-units\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459573 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-cni-bin\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459748 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-node-log\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-systemd\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.459781 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-var-lib-openvswitch\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460144 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-kubelet\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460184 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-log-socket\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-slash\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460288 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-run-netns\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460324 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-log-socket\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460335 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-ovn\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460381 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-run-ovn\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460468 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-kubelet\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460534 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-run-netns\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460586 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-slash\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460381 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsvs6\" (UniqueName: \"kubernetes.io/projected/5f256a94-2907-4e9f-a90e-8a610cca5cc7-kube-api-access-wsvs6\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460638 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460683 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-env-overrides\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.460860 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f256a94-2907-4e9f-a90e-8a610cca5cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.461313 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovnkube-script-lib\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.461457 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-env-overrides\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.461870 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovnkube-config\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.464015 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f256a94-2907-4e9f-a90e-8a610cca5cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.488575 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsvs6\" (UniqueName: \"kubernetes.io/projected/5f256a94-2907-4e9f-a90e-8a610cca5cc7-kube-api-access-wsvs6\") pod \"ovnkube-node-zpztt\" (UID: \"5f256a94-2907-4e9f-a90e-8a610cca5cc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.563981 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.617976 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovnkube-controller/3.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.620814 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovn-acl-logging/0.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621357 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2458v_480f8c1b-60cc-4685-86cc-a457f645e87c/ovn-controller/0.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621744 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" exitCode=0 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621776 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" exitCode=0 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621786 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" exitCode=0 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621795 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" exitCode=0 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621804 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" exitCode=0 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621812 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" exitCode=0 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621968 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" exitCode=143 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.621976 4728 generic.go:334] "Generic (PLEG): container finished" podID="480f8c1b-60cc-4685-86cc-a457f645e87c" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" exitCode=143 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622018 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622049 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" 
event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622062 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622075 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622089 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622101 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622113 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622124 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622131 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622138 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622145 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622151 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622158 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622165 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622171 4728 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622180 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622191 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622198 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622205 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622212 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622218 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622225 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622233 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622239 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622246 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622253 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622262 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622271 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622279 4728 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622286 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622293 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622300 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622306 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622313 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622320 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622326 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622334 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622343 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" event={"ID":"480f8c1b-60cc-4685-86cc-a457f645e87c","Type":"ContainerDied","Data":"f6a5dea1b098263c8c76edc066809df48b63e0ac843c23652034542da629e763"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622353 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622360 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622367 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622374 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622380 4728 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622387 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622393 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622399 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622425 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622432 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622448 4728 scope.go:117] "RemoveContainer" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.622644 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2458v" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.630683 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/2.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.631233 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/1.log" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.631287 4728 generic.go:334] "Generic (PLEG): container finished" podID="57f7e48b-7353-469c-ab9d-7f966c08d5f1" containerID="e87cfd286c066fb2008d76673f5ffbf9c66c1224fbc2a064ad159a47c3c27d99" exitCode=2 Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.631381 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerDied","Data":"e87cfd286c066fb2008d76673f5ffbf9c66c1224fbc2a064ad159a47c3c27d99"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.631447 4728 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.632188 4728 scope.go:117] "RemoveContainer" containerID="e87cfd286c066fb2008d76673f5ffbf9c66c1224fbc2a064ad159a47c3c27d99" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.632519 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bdpsg_openshift-multus(57f7e48b-7353-469c-ab9d-7f966c08d5f1)\"" 
pod="openshift-multus/multus-bdpsg" podUID="57f7e48b-7353-469c-ab9d-7f966c08d5f1" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.634302 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"95abc546c4ed91e617e2b4e448c87d9a167b4a2c891395157ecbc166813e8835"} Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.645332 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.648961 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2458v"] Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.669058 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2458v"] Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.691076 4728 scope.go:117] "RemoveContainer" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.733460 4728 scope.go:117] "RemoveContainer" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.754193 4728 scope.go:117] "RemoveContainer" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.775961 4728 scope.go:117] "RemoveContainer" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.846425 4728 scope.go:117] "RemoveContainer" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.858532 4728 scope.go:117] "RemoveContainer" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.870619 4728 scope.go:117] "RemoveContainer" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.883583 4728 scope.go:117] "RemoveContainer" containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.902046 4728 scope.go:117] "RemoveContainer" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.902608 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": container with ID starting with dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3 not found: ID does not exist" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.902649 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} err="failed to get container status \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": rpc error: code = NotFound desc = could not find container \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": container with ID starting with 
dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.902676 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.903059 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": container with ID starting with 0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b not found: ID does not exist" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.903089 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} err="failed to get container status \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": rpc error: code = NotFound desc = could not find container \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": container with ID starting with 0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.903152 4728 scope.go:117] "RemoveContainer" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.903496 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": container with ID starting with a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de not found: ID does not exist" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.903571 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} err="failed to get container status \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": rpc error: code = NotFound desc = could not find container \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": container with ID starting with a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.903593 4728 scope.go:117] "RemoveContainer" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.903880 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": container with ID starting with fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e not found: ID does not exist" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.903902 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} err="failed to get container status \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": rpc 
error: code = NotFound desc = could not find container \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": container with ID starting with fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.903918 4728 scope.go:117] "RemoveContainer" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.904147 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": container with ID starting with 99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc not found: ID does not exist" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.904184 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} err="failed to get container status \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": rpc error: code = NotFound desc = could not find container \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": container with ID starting with 99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.904202 4728 scope.go:117] "RemoveContainer" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.904433 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": container with ID starting with a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d not found: ID does not exist" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.904457 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} err="failed to get container status \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": rpc error: code = NotFound desc = could not find container \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": container with ID starting with a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.904475 4728 scope.go:117] "RemoveContainer" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.904717 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": container with ID starting with b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0 not found: ID does not exist" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.904745 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} err="failed to get container status \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": rpc error: code = NotFound desc = could not find container \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": container with ID starting with b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.904764 4728 scope.go:117] "RemoveContainer" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.905020 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": container with ID starting with 18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b not found: ID does not exist" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.905042 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} err="failed to get container status \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": rpc error: code = NotFound desc = could not find container \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": container with ID starting with 18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.905057 4728 scope.go:117] "RemoveContainer" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.905299 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": container with ID starting with 7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4 not found: ID does not exist" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.905352 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} err="failed to get container status \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": rpc error: code = NotFound desc = could not find container \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": container with ID starting with 7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.905376 4728 scope.go:117] "RemoveContainer" containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" Dec 16 15:06:59 crc kubenswrapper[4728]: E1216 15:06:59.905636 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": container with ID starting with 582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5 not found: ID does not exist" 
containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.905657 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} err="failed to get container status \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": rpc error: code = NotFound desc = could not find container \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": container with ID starting with 582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.905674 4728 scope.go:117] "RemoveContainer" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.906047 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} err="failed to get container status \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": rpc error: code = NotFound desc = could not find container \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": container with ID starting with dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.906133 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.906516 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} err="failed to get container status \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": rpc error: code = NotFound desc = could not find container \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": container with ID starting with 0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.906537 4728 scope.go:117] "RemoveContainer" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.906830 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} err="failed to get container status \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": rpc error: code = NotFound desc = could not find container \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": container with ID starting with a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.907089 4728 scope.go:117] "RemoveContainer" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.907500 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} err="failed to get container status \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": rpc error: code = NotFound desc = could not find 
container \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": container with ID starting with fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.907535 4728 scope.go:117] "RemoveContainer" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.908319 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} err="failed to get container status \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": rpc error: code = NotFound desc = could not find container \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": container with ID starting with 99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.908388 4728 scope.go:117] "RemoveContainer" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.908847 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} err="failed to get container status \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": rpc error: code = NotFound desc = could not find container \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": container with ID starting with a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.908877 4728 scope.go:117] "RemoveContainer" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.909119 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} err="failed to get container status \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": rpc error: code = NotFound desc = could not find container \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": container with ID starting with b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.909146 4728 scope.go:117] "RemoveContainer" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.909350 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} err="failed to get container status \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": rpc error: code = NotFound desc = could not find container \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": container with ID starting with 18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.909378 4728 scope.go:117] "RemoveContainer" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.909612 4728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} err="failed to get container status \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": rpc error: code = NotFound desc = could not find container \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": container with ID starting with 7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.909662 4728 scope.go:117] "RemoveContainer" containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.910045 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} err="failed to get container status \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": rpc error: code = NotFound desc = could not find container \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": container with ID starting with 582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.910065 4728 scope.go:117] "RemoveContainer" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.910302 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} err="failed to get container status \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": rpc error: code = NotFound desc = could not find container \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": container with ID starting with dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.910334 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.910654 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} err="failed to get container status \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": rpc error: code = NotFound desc = could not find container \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": container with ID starting with 0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.910705 4728 scope.go:117] "RemoveContainer" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.911049 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} err="failed to get container status \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": rpc error: code = NotFound desc = could not find container \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": container with ID starting with 
a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.911076 4728 scope.go:117] "RemoveContainer" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.911324 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} err="failed to get container status \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": rpc error: code = NotFound desc = could not find container \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": container with ID starting with fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.911354 4728 scope.go:117] "RemoveContainer" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.911690 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} err="failed to get container status \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": rpc error: code = NotFound desc = could not find container \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": container with ID starting with 99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.911737 4728 scope.go:117] "RemoveContainer" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.912656 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} err="failed to get container status \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": rpc error: code = NotFound desc = could not find container \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": container with ID starting with a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.912708 4728 scope.go:117] "RemoveContainer" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.913201 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} err="failed to get container status \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": rpc error: code = NotFound desc = could not find container \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": container with ID starting with b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.913277 4728 scope.go:117] "RemoveContainer" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.913680 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} err="failed to get container status \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": rpc error: code = NotFound desc = could not find container \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": container with ID starting with 18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.913711 4728 scope.go:117] "RemoveContainer" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.913997 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} err="failed to get container status \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": rpc error: code = NotFound desc = could not find container \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": container with ID starting with 7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.914028 4728 scope.go:117] "RemoveContainer" containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.914801 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} err="failed to get container status \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": rpc error: code = NotFound desc = could not find container \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": container with ID starting with 582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.914876 4728 scope.go:117] "RemoveContainer" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.915176 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} err="failed to get container status \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": rpc error: code = NotFound desc = could not find container \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": container with ID starting with dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.915236 4728 scope.go:117] "RemoveContainer" containerID="0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.915657 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b"} err="failed to get container status \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": rpc error: code = NotFound desc = could not find container \"0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b\": container with ID starting with 0a301b009e58a0d15f541826658667de3fa0fa59e3d9d7201257d834e5f12e8b not found: ID does not exist" Dec 
16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.915693 4728 scope.go:117] "RemoveContainer" containerID="a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.916012 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de"} err="failed to get container status \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": rpc error: code = NotFound desc = could not find container \"a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de\": container with ID starting with a331ea208fb2560f70c61ec37c417649726e52f784806f315f4e7c69df67a8de not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.916042 4728 scope.go:117] "RemoveContainer" containerID="fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.916451 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e"} err="failed to get container status \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": rpc error: code = NotFound desc = could not find container \"fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e\": container with ID starting with fc2da3e0288df7d2233862beb92af2369beaaa76d96b4de049cc62e08ef8201e not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.916487 4728 scope.go:117] "RemoveContainer" containerID="99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.916878 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc"} err="failed to get container status \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": rpc error: code = NotFound desc = could not find container \"99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc\": container with ID starting with 99d3507dbe06c9cd05f4a504bf01823cec7c9816149c2ad50bc99b4e7220e8cc not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.916911 4728 scope.go:117] "RemoveContainer" containerID="a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.917383 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d"} err="failed to get container status \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": rpc error: code = NotFound desc = could not find container \"a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d\": container with ID starting with a0895a8e2a274f33aee9e46f5909e29bafb2ed97621a3c5f07a8e715ffff1e3d not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.917485 4728 scope.go:117] "RemoveContainer" containerID="b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.917858 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0"} err="failed to get container status 
\"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": rpc error: code = NotFound desc = could not find container \"b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0\": container with ID starting with b15a0fc7349dd801c5cc5b8e381b1b0893a52cd04bc7c04285f97cccbef492b0 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.917887 4728 scope.go:117] "RemoveContainer" containerID="18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.918096 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b"} err="failed to get container status \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": rpc error: code = NotFound desc = could not find container \"18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b\": container with ID starting with 18121c80118cd6f4daafeb2ffef3cbed6cc5f55983a05c1a07c8adea2ad3265b not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.918126 4728 scope.go:117] "RemoveContainer" containerID="7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.918461 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4"} err="failed to get container status \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": rpc error: code = NotFound desc = could not find container \"7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4\": container with ID starting with 7d33035ec7123ba2196004b7ca8ebdb466b169c194d1c0ecd0e4adbd6757eea4 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.918490 4728 scope.go:117] "RemoveContainer" containerID="582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.918890 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5"} err="failed to get container status \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": rpc error: code = NotFound desc = could not find container \"582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5\": container with ID starting with 582d27c36a35e80380a092a7ec8edc7008efc9892932885df52df10a573c4ed5 not found: ID does not exist" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.918973 4728 scope.go:117] "RemoveContainer" containerID="dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3" Dec 16 15:06:59 crc kubenswrapper[4728]: I1216 15:06:59.919374 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3"} err="failed to get container status \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": rpc error: code = NotFound desc = could not find container \"dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3\": container with ID starting with dc95992bc2647eb8d6879751599730e8aef649f4d9de2a4203d6d5f9b2814dd3 not found: ID does not exist" Dec 16 15:07:00 crc kubenswrapper[4728]: I1216 15:07:00.641749 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="5f256a94-2907-4e9f-a90e-8a610cca5cc7" containerID="a907ceedca33572da82fc8e6de0e4359060ae05d3855fb4b67c122864fe0b25a" exitCode=0 Dec 16 15:07:00 crc kubenswrapper[4728]: I1216 15:07:00.641844 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerDied","Data":"a907ceedca33572da82fc8e6de0e4359060ae05d3855fb4b67c122864fe0b25a"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.513509 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480f8c1b-60cc-4685-86cc-a457f645e87c" path="/var/lib/kubelet/pods/480f8c1b-60cc-4685-86cc-a457f645e87c/volumes" Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.648145 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"7de6931aeef407d4b0c16bc1556478a1a9a0b6bbe6a1e6c12f0405f195c31eb2"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.648190 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"a073d317738b950c003d5f25dd608f53b6e3a01c973788c4ff77e9b279743485"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.648204 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"a0a97a7ec7444734aa31989aba9a28d4e16be30e2bf9679b55d8987ae94e2d38"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.648215 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"78e2a3d879eef7be9314187ead3a64dbd73522eb84b50dc2976f9bd9fc66535a"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.648227 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"1fca89c47ca00a9ae6539c75310fc87a67741d5f95b6c6a188c3da6e2d37680c"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.648237 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"00341e6790470cd9e5c902c9ac8b83df298b3fc0dc7b140d66e4474f3041269a"} Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.855819 4728 scope.go:117] "RemoveContainer" containerID="e18df7954027147ab0bb73754629f6cd84b0858d96f2f7c69c11aee60607c3b9" Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.881273 4728 scope.go:117] "RemoveContainer" containerID="233967acac0c91bddd4b6ae2b5089ca6520a2f51f40521cdaed11d977eefdd42" Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.899025 4728 scope.go:117] "RemoveContainer" containerID="3b508f4b5e573abda9f40fbc51d31e7edc096fac27ef88cf0697e8edb24c337e" Dec 16 15:07:01 crc kubenswrapper[4728]: I1216 15:07:01.913113 4728 scope.go:117] "RemoveContainer" containerID="1d60f4de214c5dc1127f403bbccd7a1fd6deb38032dee534b69868064d7b4076" Dec 16 15:07:02 crc kubenswrapper[4728]: I1216 15:07:02.658359 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/2.log" Dec 
16 15:07:04 crc kubenswrapper[4728]: I1216 15:07:04.682381 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"a2b56ea531171f51048723330c52b6c2e15deb23afef1f7a8ebffd4145b0c795"} Dec 16 15:07:06 crc kubenswrapper[4728]: I1216 15:07:06.699899 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" event={"ID":"5f256a94-2907-4e9f-a90e-8a610cca5cc7","Type":"ContainerStarted","Data":"e4102b5a0d2989654d8e2511ce0698cc92fc4ecedc7ae1a53ff39e7779aa9e0d"} Dec 16 15:07:06 crc kubenswrapper[4728]: I1216 15:07:06.700363 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:07:06 crc kubenswrapper[4728]: I1216 15:07:06.700671 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:07:06 crc kubenswrapper[4728]: I1216 15:07:06.726077 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:07:06 crc kubenswrapper[4728]: I1216 15:07:06.754928 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" podStartSLOduration=7.754909953 podStartE2EDuration="7.754909953s" podCreationTimestamp="2025-12-16 15:06:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:07:06.742315374 +0000 UTC m=+607.582494368" watchObservedRunningTime="2025-12-16 15:07:06.754909953 +0000 UTC m=+607.595088947" Dec 16 15:07:07 crc kubenswrapper[4728]: I1216 15:07:07.707216 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:07:07 crc kubenswrapper[4728]: I1216 15:07:07.753946 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:07:08 crc kubenswrapper[4728]: I1216 15:07:08.819108 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:07:08 crc kubenswrapper[4728]: I1216 15:07:08.819859 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:07:12 crc kubenswrapper[4728]: I1216 15:07:12.506666 4728 scope.go:117] "RemoveContainer" containerID="e87cfd286c066fb2008d76673f5ffbf9c66c1224fbc2a064ad159a47c3c27d99" Dec 16 15:07:12 crc kubenswrapper[4728]: E1216 15:07:12.506911 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bdpsg_openshift-multus(57f7e48b-7353-469c-ab9d-7f966c08d5f1)\"" pod="openshift-multus/multus-bdpsg" podUID="57f7e48b-7353-469c-ab9d-7f966c08d5f1" Dec 16 15:07:23 crc kubenswrapper[4728]: I1216 15:07:23.506834 4728 scope.go:117] 
"RemoveContainer" containerID="e87cfd286c066fb2008d76673f5ffbf9c66c1224fbc2a064ad159a47c3c27d99" Dec 16 15:07:24 crc kubenswrapper[4728]: I1216 15:07:24.831046 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bdpsg_57f7e48b-7353-469c-ab9d-7f966c08d5f1/kube-multus/2.log" Dec 16 15:07:24 crc kubenswrapper[4728]: I1216 15:07:24.832279 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bdpsg" event={"ID":"57f7e48b-7353-469c-ab9d-7f966c08d5f1","Type":"ContainerStarted","Data":"85611e03a45785a7ff10d604febfe728ce14f5c33db6cfb94770348f17f3c9ad"} Dec 16 15:07:29 crc kubenswrapper[4728]: I1216 15:07:29.601708 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zpztt" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.679779 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m"] Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.681284 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.684430 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.689782 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m"] Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.807933 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdsgd\" (UniqueName: \"kubernetes.io/projected/080ce000-f07f-47b8-ad87-0dd66e7a6fba-kube-api-access-wdsgd\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.807988 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.808047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.908790 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc 
kubenswrapper[4728]: I1216 15:07:37.908844 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdsgd\" (UniqueName: \"kubernetes.io/projected/080ce000-f07f-47b8-ad87-0dd66e7a6fba-kube-api-access-wdsgd\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.908879 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.909267 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.909308 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.932463 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdsgd\" (UniqueName: \"kubernetes.io/projected/080ce000-f07f-47b8-ad87-0dd66e7a6fba-kube-api-access-wdsgd\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:37 crc kubenswrapper[4728]: I1216 15:07:37.998860 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.226802 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m"] Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.818264 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.818329 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.818379 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.818994 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5556e0d6dfe6e1666b1eb820e6992928174cc0e89be80318dfc33d104f059a37"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.819052 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://5556e0d6dfe6e1666b1eb820e6992928174cc0e89be80318dfc33d104f059a37" gracePeriod=600 Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.917908 4728 generic.go:334] "Generic (PLEG): container finished" podID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerID="cefdc739044043ea77022073e66d45beb178bc403bb92b58dc361ad0a71dfcc6" exitCode=0 Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.917959 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" event={"ID":"080ce000-f07f-47b8-ad87-0dd66e7a6fba","Type":"ContainerDied","Data":"cefdc739044043ea77022073e66d45beb178bc403bb92b58dc361ad0a71dfcc6"} Dec 16 15:07:38 crc kubenswrapper[4728]: I1216 15:07:38.918189 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" event={"ID":"080ce000-f07f-47b8-ad87-0dd66e7a6fba","Type":"ContainerStarted","Data":"fd543e4722f1ab8beefe845ec9999bdb4cb34d22ee7111d6bfbb737696730591"} Dec 16 15:07:39 crc kubenswrapper[4728]: I1216 15:07:39.925756 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="5556e0d6dfe6e1666b1eb820e6992928174cc0e89be80318dfc33d104f059a37" exitCode=0 Dec 16 15:07:39 crc kubenswrapper[4728]: I1216 15:07:39.925807 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" 
event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"5556e0d6dfe6e1666b1eb820e6992928174cc0e89be80318dfc33d104f059a37"} Dec 16 15:07:39 crc kubenswrapper[4728]: I1216 15:07:39.926114 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"0cc664ff3879b159126f992f52a6c4ccf1fc8c0903483566c983c5026f497d68"} Dec 16 15:07:39 crc kubenswrapper[4728]: I1216 15:07:39.926144 4728 scope.go:117] "RemoveContainer" containerID="4bda00ce73e1c1ab471f206d48aed0e38d16bcd1f6b879870ad51db12f879d97" Dec 16 15:07:40 crc kubenswrapper[4728]: I1216 15:07:40.937805 4728 generic.go:334] "Generic (PLEG): container finished" podID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerID="861e61891daff01e410c4bfe4333e097946ed8af8b225e383ca48e476356a1bc" exitCode=0 Dec 16 15:07:40 crc kubenswrapper[4728]: I1216 15:07:40.937926 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" event={"ID":"080ce000-f07f-47b8-ad87-0dd66e7a6fba","Type":"ContainerDied","Data":"861e61891daff01e410c4bfe4333e097946ed8af8b225e383ca48e476356a1bc"} Dec 16 15:07:41 crc kubenswrapper[4728]: I1216 15:07:41.949681 4728 generic.go:334] "Generic (PLEG): container finished" podID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerID="0f32dac26e6182c1488b51cff89c6cc06155ad5a6b3f83d097649185766b49f0" exitCode=0 Dec 16 15:07:41 crc kubenswrapper[4728]: I1216 15:07:41.949798 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" event={"ID":"080ce000-f07f-47b8-ad87-0dd66e7a6fba","Type":"ContainerDied","Data":"0f32dac26e6182c1488b51cff89c6cc06155ad5a6b3f83d097649185766b49f0"} Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.213029 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.386117 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdsgd\" (UniqueName: \"kubernetes.io/projected/080ce000-f07f-47b8-ad87-0dd66e7a6fba-kube-api-access-wdsgd\") pod \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.386308 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-bundle\") pod \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.386476 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-util\") pod \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\" (UID: \"080ce000-f07f-47b8-ad87-0dd66e7a6fba\") " Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.387975 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-bundle" (OuterVolumeSpecName: "bundle") pod "080ce000-f07f-47b8-ad87-0dd66e7a6fba" (UID: "080ce000-f07f-47b8-ad87-0dd66e7a6fba"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.393577 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/080ce000-f07f-47b8-ad87-0dd66e7a6fba-kube-api-access-wdsgd" (OuterVolumeSpecName: "kube-api-access-wdsgd") pod "080ce000-f07f-47b8-ad87-0dd66e7a6fba" (UID: "080ce000-f07f-47b8-ad87-0dd66e7a6fba"). InnerVolumeSpecName "kube-api-access-wdsgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.405285 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-util" (OuterVolumeSpecName: "util") pod "080ce000-f07f-47b8-ad87-0dd66e7a6fba" (UID: "080ce000-f07f-47b8-ad87-0dd66e7a6fba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.488609 4728 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.488673 4728 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/080ce000-f07f-47b8-ad87-0dd66e7a6fba-util\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.488701 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdsgd\" (UniqueName: \"kubernetes.io/projected/080ce000-f07f-47b8-ad87-0dd66e7a6fba-kube-api-access-wdsgd\") on node \"crc\" DevicePath \"\"" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.968391 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" event={"ID":"080ce000-f07f-47b8-ad87-0dd66e7a6fba","Type":"ContainerDied","Data":"fd543e4722f1ab8beefe845ec9999bdb4cb34d22ee7111d6bfbb737696730591"} Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.969210 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd543e4722f1ab8beefe845ec9999bdb4cb34d22ee7111d6bfbb737696730591" Dec 16 15:07:43 crc kubenswrapper[4728]: I1216 15:07:43.968531 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.226945 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-9mppg"] Dec 16 15:07:49 crc kubenswrapper[4728]: E1216 15:07:49.227345 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="util" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.227358 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="util" Dec 16 15:07:49 crc kubenswrapper[4728]: E1216 15:07:49.227368 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="pull" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.227374 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="pull" Dec 16 15:07:49 crc kubenswrapper[4728]: E1216 15:07:49.227388 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="extract" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.227394 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="extract" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.227501 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="080ce000-f07f-47b8-ad87-0dd66e7a6fba" containerName="extract" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.227837 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.229686 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pc77n" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.229728 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.230095 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.247142 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-9mppg"] Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.364226 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfb7\" (UniqueName: \"kubernetes.io/projected/fdf13fea-12cb-4713-bae2-3cabd3aae756-kube-api-access-wwfb7\") pod \"nmstate-operator-6769fb99d-9mppg\" (UID: \"fdf13fea-12cb-4713-bae2-3cabd3aae756\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.465480 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfb7\" (UniqueName: \"kubernetes.io/projected/fdf13fea-12cb-4713-bae2-3cabd3aae756-kube-api-access-wwfb7\") pod \"nmstate-operator-6769fb99d-9mppg\" (UID: \"fdf13fea-12cb-4713-bae2-3cabd3aae756\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.488132 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfb7\" 
(UniqueName: \"kubernetes.io/projected/fdf13fea-12cb-4713-bae2-3cabd3aae756-kube-api-access-wwfb7\") pod \"nmstate-operator-6769fb99d-9mppg\" (UID: \"fdf13fea-12cb-4713-bae2-3cabd3aae756\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.546750 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" Dec 16 15:07:49 crc kubenswrapper[4728]: I1216 15:07:49.747539 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-9mppg"] Dec 16 15:07:49 crc kubenswrapper[4728]: W1216 15:07:49.750305 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf13fea_12cb_4713_bae2_3cabd3aae756.slice/crio-e840d096f30e49f6a61e6f5048dd14205fd184bf0cb0854aba377d220db8ded9 WatchSource:0}: Error finding container e840d096f30e49f6a61e6f5048dd14205fd184bf0cb0854aba377d220db8ded9: Status 404 returned error can't find the container with id e840d096f30e49f6a61e6f5048dd14205fd184bf0cb0854aba377d220db8ded9 Dec 16 15:07:50 crc kubenswrapper[4728]: I1216 15:07:50.001551 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" event={"ID":"fdf13fea-12cb-4713-bae2-3cabd3aae756","Type":"ContainerStarted","Data":"e840d096f30e49f6a61e6f5048dd14205fd184bf0cb0854aba377d220db8ded9"} Dec 16 15:07:57 crc kubenswrapper[4728]: I1216 15:07:57.046397 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" event={"ID":"fdf13fea-12cb-4713-bae2-3cabd3aae756","Type":"ContainerStarted","Data":"45def12a7775bb995c384f46663a5ba965ead27888ab0332e8a3e143d45e893c"} Dec 16 15:07:57 crc kubenswrapper[4728]: I1216 15:07:57.069556 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-9mppg" podStartSLOduration=1.14310962 podStartE2EDuration="8.069525756s" podCreationTimestamp="2025-12-16 15:07:49 +0000 UTC" firstStartedPulling="2025-12-16 15:07:49.75314631 +0000 UTC m=+650.593325294" lastFinishedPulling="2025-12-16 15:07:56.679562446 +0000 UTC m=+657.519741430" observedRunningTime="2025-12-16 15:07:57.068913999 +0000 UTC m=+657.909092983" watchObservedRunningTime="2025-12-16 15:07:57.069525756 +0000 UTC m=+657.909704780" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.140386 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.141361 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.143686 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z9l5d" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.159079 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.167882 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.168732 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.173378 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.177834 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9zt\" (UniqueName: \"kubernetes.io/projected/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-kube-api-access-kj9zt\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.177886 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.177999 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8n78\" (UniqueName: \"kubernetes.io/projected/22a101d0-c77f-42c4-88e7-ff7bfb0c204d-kube-api-access-w8n78\") pod \"nmstate-metrics-7f7f7578db-fkppc\" (UID: \"22a101d0-c77f-42c4-88e7-ff7bfb0c204d\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.200898 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.203723 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hd8rz"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.204353 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.279292 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8n78\" (UniqueName: \"kubernetes.io/projected/22a101d0-c77f-42c4-88e7-ff7bfb0c204d-kube-api-access-w8n78\") pod \"nmstate-metrics-7f7f7578db-fkppc\" (UID: \"22a101d0-c77f-42c4-88e7-ff7bfb0c204d\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.279349 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9zt\" (UniqueName: \"kubernetes.io/projected/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-kube-api-access-kj9zt\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.279377 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: E1216 15:07:58.279527 4728 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 16 15:07:58 crc kubenswrapper[4728]: E1216 15:07:58.279577 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-tls-key-pair podName:f35491e6-33aa-4c1d-a9c0-1b95f43ad54f nodeName:}" failed. No retries permitted until 2025-12-16 15:07:58.779560116 +0000 UTC m=+659.619739100 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-tls-key-pair") pod "nmstate-webhook-f8fb84555-r6qf9" (UID: "f35491e6-33aa-4c1d-a9c0-1b95f43ad54f") : secret "openshift-nmstate-webhook" not found Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.303688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9zt\" (UniqueName: \"kubernetes.io/projected/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-kube-api-access-kj9zt\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.307384 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8n78\" (UniqueName: \"kubernetes.io/projected/22a101d0-c77f-42c4-88e7-ff7bfb0c204d-kube-api-access-w8n78\") pod \"nmstate-metrics-7f7f7578db-fkppc\" (UID: \"22a101d0-c77f-42c4-88e7-ff7bfb0c204d\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.381131 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfd7w\" (UniqueName: \"kubernetes.io/projected/d61cf9e1-67c0-4258-af87-e4244df3c68e-kube-api-access-zfd7w\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.381522 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-dbus-socket\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.381554 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-nmstate-lock\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.381608 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-ovs-socket\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.387110 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.388101 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.390163 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.391213 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.392768 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4vklx" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.401257 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.465882 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482383 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfd7w\" (UniqueName: \"kubernetes.io/projected/d61cf9e1-67c0-4258-af87-e4244df3c68e-kube-api-access-zfd7w\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482434 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-dbus-socket\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482455 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-nmstate-lock\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482490 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-ovs-socket\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482520 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84209333-74b0-4804-ac6e-e829f0ec1bc7-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482542 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx2ws\" (UniqueName: \"kubernetes.io/projected/84209333-74b0-4804-ac6e-e829f0ec1bc7-kube-api-access-cx2ws\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482560 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84209333-74b0-4804-ac6e-e829f0ec1bc7-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482790 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-ovs-socket\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.482852 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-nmstate-lock\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.483206 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d61cf9e1-67c0-4258-af87-e4244df3c68e-dbus-socket\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.504854 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfd7w\" (UniqueName: \"kubernetes.io/projected/d61cf9e1-67c0-4258-af87-e4244df3c68e-kube-api-access-zfd7w\") pod \"nmstate-handler-hd8rz\" (UID: \"d61cf9e1-67c0-4258-af87-e4244df3c68e\") " pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.561211 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb7678fb5-xhp52"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.562022 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.575373 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.580287 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7678fb5-xhp52"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583242 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkxd\" (UniqueName: \"kubernetes.io/projected/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-kube-api-access-fmkxd\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583296 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-config\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583313 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-trusted-ca-bundle\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583336 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-oauth-serving-cert\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583353 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-service-ca\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583382 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-serving-cert\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583398 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-oauth-config\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583479 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84209333-74b0-4804-ac6e-e829f0ec1bc7-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " 
pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583502 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx2ws\" (UniqueName: \"kubernetes.io/projected/84209333-74b0-4804-ac6e-e829f0ec1bc7-kube-api-access-cx2ws\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.583519 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84209333-74b0-4804-ac6e-e829f0ec1bc7-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: E1216 15:07:58.583679 4728 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 16 15:07:58 crc kubenswrapper[4728]: E1216 15:07:58.583736 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84209333-74b0-4804-ac6e-e829f0ec1bc7-plugin-serving-cert podName:84209333-74b0-4804-ac6e-e829f0ec1bc7 nodeName:}" failed. No retries permitted until 2025-12-16 15:07:59.083717703 +0000 UTC m=+659.923896687 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/84209333-74b0-4804-ac6e-e829f0ec1bc7-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-zzbs8" (UID: "84209333-74b0-4804-ac6e-e829f0ec1bc7") : secret "plugin-serving-cert" not found Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.584212 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84209333-74b0-4804-ac6e-e829f0ec1bc7-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.617025 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx2ws\" (UniqueName: \"kubernetes.io/projected/84209333-74b0-4804-ac6e-e829f0ec1bc7-kube-api-access-cx2ws\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684500 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-config\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684556 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-trusted-ca-bundle\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-oauth-serving-cert\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684621 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-service-ca\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684663 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-serving-cert\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-oauth-config\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.684758 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkxd\" (UniqueName: \"kubernetes.io/projected/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-kube-api-access-fmkxd\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.686309 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-oauth-serving-cert\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.686390 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-service-ca\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.686751 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-trusted-ca-bundle\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.687241 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-config\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.689360 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-oauth-config\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.689554 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-console-serving-cert\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.701042 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkxd\" (UniqueName: \"kubernetes.io/projected/7bb55fcf-3f65-4c0e-8d67-01448f62b60c-kube-api-access-fmkxd\") pod \"console-6fb7678fb5-xhp52\" (UID: \"7bb55fcf-3f65-4c0e-8d67-01448f62b60c\") " pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.734563 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc"] Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.785688 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.788775 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f35491e6-33aa-4c1d-a9c0-1b95f43ad54f-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-r6qf9\" (UID: \"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:58 crc kubenswrapper[4728]: I1216 15:07:58.880946 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.058312 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hd8rz" event={"ID":"d61cf9e1-67c0-4258-af87-e4244df3c68e","Type":"ContainerStarted","Data":"dcb5fc4244232b85c25efacfd393fd6955ea8683e33ccc860c028471746df3f0"} Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.060470 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" event={"ID":"22a101d0-c77f-42c4-88e7-ff7bfb0c204d","Type":"ContainerStarted","Data":"f61002bdacd3e95017510bd22898f3dc26720e4aeca4bf1a01146f9f77e4bd48"} Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.069124 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7678fb5-xhp52"] Dec 16 15:07:59 crc kubenswrapper[4728]: W1216 15:07:59.071822 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb55fcf_3f65_4c0e_8d67_01448f62b60c.slice/crio-cd5addf87ef308a80a4473f9236ae1372aa7ce321443306034f8a0335e507be4 WatchSource:0}: Error finding container cd5addf87ef308a80a4473f9236ae1372aa7ce321443306034f8a0335e507be4: Status 404 returned error can't find the container with id cd5addf87ef308a80a4473f9236ae1372aa7ce321443306034f8a0335e507be4 Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.083795 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.090724 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84209333-74b0-4804-ac6e-e829f0ec1bc7-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.094883 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/84209333-74b0-4804-ac6e-e829f0ec1bc7-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-zzbs8\" (UID: \"84209333-74b0-4804-ac6e-e829f0ec1bc7\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.286151 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9"] Dec 16 15:07:59 crc kubenswrapper[4728]: W1216 15:07:59.293180 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35491e6_33aa_4c1d_a9c0_1b95f43ad54f.slice/crio-e89b760b6ca61bcb1f6c5fd1c925cdd75d11805e1f963bb36fd4263d59def393 WatchSource:0}: Error finding container e89b760b6ca61bcb1f6c5fd1c925cdd75d11805e1f963bb36fd4263d59def393: Status 404 returned error can't find the container with id e89b760b6ca61bcb1f6c5fd1c925cdd75d11805e1f963bb36fd4263d59def393 Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.308752 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" Dec 16 15:07:59 crc kubenswrapper[4728]: I1216 15:07:59.744317 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8"] Dec 16 15:07:59 crc kubenswrapper[4728]: W1216 15:07:59.778800 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84209333_74b0_4804_ac6e_e829f0ec1bc7.slice/crio-2894fb108fb0f1d7bd8572eb79a9f36e8bc6405649137263146f5f4189118b7e WatchSource:0}: Error finding container 2894fb108fb0f1d7bd8572eb79a9f36e8bc6405649137263146f5f4189118b7e: Status 404 returned error can't find the container with id 2894fb108fb0f1d7bd8572eb79a9f36e8bc6405649137263146f5f4189118b7e Dec 16 15:08:00 crc kubenswrapper[4728]: I1216 15:08:00.077669 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" event={"ID":"84209333-74b0-4804-ac6e-e829f0ec1bc7","Type":"ContainerStarted","Data":"2894fb108fb0f1d7bd8572eb79a9f36e8bc6405649137263146f5f4189118b7e"} Dec 16 15:08:00 crc kubenswrapper[4728]: I1216 15:08:00.080521 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7678fb5-xhp52" event={"ID":"7bb55fcf-3f65-4c0e-8d67-01448f62b60c","Type":"ContainerStarted","Data":"cd5addf87ef308a80a4473f9236ae1372aa7ce321443306034f8a0335e507be4"} Dec 16 15:08:00 crc kubenswrapper[4728]: I1216 15:08:00.082834 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" event={"ID":"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f","Type":"ContainerStarted","Data":"e89b760b6ca61bcb1f6c5fd1c925cdd75d11805e1f963bb36fd4263d59def393"} Dec 16 15:08:03 crc kubenswrapper[4728]: I1216 15:08:03.110655 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7678fb5-xhp52" event={"ID":"7bb55fcf-3f65-4c0e-8d67-01448f62b60c","Type":"ContainerStarted","Data":"7e81d1a06e751beaee4225473148513a3be5cbcb4012db317f30da6d15845a64"} Dec 16 15:08:03 crc kubenswrapper[4728]: I1216 15:08:03.131449 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb7678fb5-xhp52" podStartSLOduration=5.131429927 podStartE2EDuration="5.131429927s" podCreationTimestamp="2025-12-16 15:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:08:03.131105338 +0000 UTC m=+663.971284342" watchObservedRunningTime="2025-12-16 15:08:03.131429927 +0000 UTC m=+663.971608911" Dec 16 15:08:06 crc kubenswrapper[4728]: I1216 15:08:06.135254 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" event={"ID":"22a101d0-c77f-42c4-88e7-ff7bfb0c204d","Type":"ContainerStarted","Data":"59e849a43f08460c1e19d9cb1f8e0b1384bd2fca61f9ea6b1bd163f18046b155"} Dec 16 15:08:06 crc kubenswrapper[4728]: I1216 15:08:06.136688 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" event={"ID":"f35491e6-33aa-4c1d-a9c0-1b95f43ad54f","Type":"ContainerStarted","Data":"ff8c4a77bebfe178d73aa0d4f2837d7093ee69b0a6b3a287556d8ae97ab60dd8"} Dec 16 15:08:06 crc kubenswrapper[4728]: I1216 15:08:06.136757 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:08:06 crc 
kubenswrapper[4728]: I1216 15:08:06.138549 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hd8rz" event={"ID":"d61cf9e1-67c0-4258-af87-e4244df3c68e","Type":"ContainerStarted","Data":"aeb93c0ef4a6787feed49c33ce34f2735f4f7a26b652f2b8fd5c563e2be8ed78"} Dec 16 15:08:06 crc kubenswrapper[4728]: I1216 15:08:06.139268 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:08:06 crc kubenswrapper[4728]: I1216 15:08:06.153482 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" podStartSLOduration=2.407006873 podStartE2EDuration="8.153469677s" podCreationTimestamp="2025-12-16 15:07:58 +0000 UTC" firstStartedPulling="2025-12-16 15:07:59.295274068 +0000 UTC m=+660.135453052" lastFinishedPulling="2025-12-16 15:08:05.041736872 +0000 UTC m=+665.881915856" observedRunningTime="2025-12-16 15:08:06.148576943 +0000 UTC m=+666.988755927" watchObservedRunningTime="2025-12-16 15:08:06.153469677 +0000 UTC m=+666.993648651" Dec 16 15:08:06 crc kubenswrapper[4728]: I1216 15:08:06.161514 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hd8rz" podStartSLOduration=1.780779367 podStartE2EDuration="8.161463496s" podCreationTimestamp="2025-12-16 15:07:58 +0000 UTC" firstStartedPulling="2025-12-16 15:07:58.628677835 +0000 UTC m=+659.468856819" lastFinishedPulling="2025-12-16 15:08:05.009361924 +0000 UTC m=+665.849540948" observedRunningTime="2025-12-16 15:08:06.161381604 +0000 UTC m=+667.001560588" watchObservedRunningTime="2025-12-16 15:08:06.161463496 +0000 UTC m=+667.001642480" Dec 16 15:08:08 crc kubenswrapper[4728]: I1216 15:08:08.151122 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" event={"ID":"84209333-74b0-4804-ac6e-e829f0ec1bc7","Type":"ContainerStarted","Data":"26a2f4b93375acd64200e09577d02de4d8a8ce3e4e8beac59d5bd4535f884477"} Dec 16 15:08:08 crc kubenswrapper[4728]: I1216 15:08:08.169325 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-zzbs8" podStartSLOduration=2.053044552 podStartE2EDuration="10.169305716s" podCreationTimestamp="2025-12-16 15:07:58 +0000 UTC" firstStartedPulling="2025-12-16 15:07:59.781044035 +0000 UTC m=+660.621223009" lastFinishedPulling="2025-12-16 15:08:07.897305189 +0000 UTC m=+668.737484173" observedRunningTime="2025-12-16 15:08:08.167354302 +0000 UTC m=+669.007533286" watchObservedRunningTime="2025-12-16 15:08:08.169305716 +0000 UTC m=+669.009484700" Dec 16 15:08:08 crc kubenswrapper[4728]: I1216 15:08:08.882318 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:08:08 crc kubenswrapper[4728]: I1216 15:08:08.882367 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:08:08 crc kubenswrapper[4728]: I1216 15:08:08.892036 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:08:09 crc kubenswrapper[4728]: I1216 15:08:09.162185 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" 
event={"ID":"22a101d0-c77f-42c4-88e7-ff7bfb0c204d","Type":"ContainerStarted","Data":"6cd5b553d0329a36a8ed2c75e64f68cad10ef1b21730dbe0cbb4e86950b63d04"} Dec 16 15:08:09 crc kubenswrapper[4728]: I1216 15:08:09.169448 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fb7678fb5-xhp52" Dec 16 15:08:09 crc kubenswrapper[4728]: I1216 15:08:09.180906 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-fkppc" podStartSLOduration=1.051596459 podStartE2EDuration="11.180887845s" podCreationTimestamp="2025-12-16 15:07:58 +0000 UTC" firstStartedPulling="2025-12-16 15:07:58.742346552 +0000 UTC m=+659.582525536" lastFinishedPulling="2025-12-16 15:08:08.871637948 +0000 UTC m=+669.711816922" observedRunningTime="2025-12-16 15:08:09.180812133 +0000 UTC m=+670.020991197" watchObservedRunningTime="2025-12-16 15:08:09.180887845 +0000 UTC m=+670.021066839" Dec 16 15:08:09 crc kubenswrapper[4728]: I1216 15:08:09.260028 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pr5wl"] Dec 16 15:08:13 crc kubenswrapper[4728]: I1216 15:08:13.605792 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hd8rz" Dec 16 15:08:19 crc kubenswrapper[4728]: I1216 15:08:19.092773 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-r6qf9" Dec 16 15:08:29 crc kubenswrapper[4728]: E1216 15:08:29.146516 4728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.641s" Dec 16 15:08:34 crc kubenswrapper[4728]: I1216 15:08:34.321203 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pr5wl" podUID="db457bae-59bc-4ec6-b5dd-8699c5794f76" containerName="console" containerID="cri-o://dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b" gracePeriod=15 Dec 16 15:08:34 crc kubenswrapper[4728]: I1216 15:08:34.951296 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pr5wl_db457bae-59bc-4ec6-b5dd-8699c5794f76/console/0.log" Dec 16 15:08:34 crc kubenswrapper[4728]: I1216 15:08:34.951740 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.063602 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-oauth-config\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.063676 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzvzd\" (UniqueName: \"kubernetes.io/projected/db457bae-59bc-4ec6-b5dd-8699c5794f76-kube-api-access-tzvzd\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.063778 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-serving-cert\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.063837 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-config\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.063906 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-service-ca\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.063950 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-trusted-ca-bundle\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.064030 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-oauth-serving-cert\") pod \"db457bae-59bc-4ec6-b5dd-8699c5794f76\" (UID: \"db457bae-59bc-4ec6-b5dd-8699c5794f76\") " Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.064921 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.065069 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-service-ca" (OuterVolumeSpecName: "service-ca") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.065205 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-config" (OuterVolumeSpecName: "console-config") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.065235 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.070755 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db457bae-59bc-4ec6-b5dd-8699c5794f76-kube-api-access-tzvzd" (OuterVolumeSpecName: "kube-api-access-tzvzd") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "kube-api-access-tzvzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.075550 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.075889 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db457bae-59bc-4ec6-b5dd-8699c5794f76" (UID: "db457bae-59bc-4ec6-b5dd-8699c5794f76"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165609 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzvzd\" (UniqueName: \"kubernetes.io/projected/db457bae-59bc-4ec6-b5dd-8699c5794f76-kube-api-access-tzvzd\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165893 4728 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165905 4728 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165917 4728 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-console-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165929 4728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-service-ca\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165941 4728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.165951 4728 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db457bae-59bc-4ec6-b5dd-8699c5794f76-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.352675 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pr5wl_db457bae-59bc-4ec6-b5dd-8699c5794f76/console/0.log" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.352722 4728 generic.go:334] "Generic (PLEG): container finished" podID="db457bae-59bc-4ec6-b5dd-8699c5794f76" containerID="dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b" exitCode=2 Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.352754 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pr5wl" event={"ID":"db457bae-59bc-4ec6-b5dd-8699c5794f76","Type":"ContainerDied","Data":"dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b"} Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.352789 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pr5wl" event={"ID":"db457bae-59bc-4ec6-b5dd-8699c5794f76","Type":"ContainerDied","Data":"cfc60d52c1500061a07d9c1f39939fb4a4fe915d8c2ee698ee4cfab05509d566"} Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.352803 4728 scope.go:117] "RemoveContainer" containerID="dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.352790 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-pr5wl" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.391196 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pr5wl"] Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.392859 4728 scope.go:117] "RemoveContainer" containerID="dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b" Dec 16 15:08:35 crc kubenswrapper[4728]: E1216 15:08:35.393689 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b\": container with ID starting with dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b not found: ID does not exist" containerID="dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.393721 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b"} err="failed to get container status \"dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b\": rpc error: code = NotFound desc = could not find container \"dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b\": container with ID starting with dfc4687368f5f569cccb75c4963d584da0c48c05f1b40ca8bddec5711588fc6b not found: ID does not exist" Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.395226 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pr5wl"] Dec 16 15:08:35 crc kubenswrapper[4728]: I1216 15:08:35.515746 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db457bae-59bc-4ec6-b5dd-8699c5794f76" path="/var/lib/kubelet/pods/db457bae-59bc-4ec6-b5dd-8699c5794f76/volumes" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.461631 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk"] Dec 16 15:08:39 crc kubenswrapper[4728]: E1216 15:08:39.463733 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db457bae-59bc-4ec6-b5dd-8699c5794f76" containerName="console" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.463749 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="db457bae-59bc-4ec6-b5dd-8699c5794f76" containerName="console" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.463861 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="db457bae-59bc-4ec6-b5dd-8699c5794f76" containerName="console" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.464652 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.466857 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.474739 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk"] Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.527538 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.527618 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.527666 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vjqw\" (UniqueName: \"kubernetes.io/projected/7fda98d5-5127-49d8-a054-ece045552e27-kube-api-access-7vjqw\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.628868 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vjqw\" (UniqueName: \"kubernetes.io/projected/7fda98d5-5127-49d8-a054-ece045552e27-kube-api-access-7vjqw\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.629027 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.629093 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.629773 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.629796 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.649639 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vjqw\" (UniqueName: \"kubernetes.io/projected/7fda98d5-5127-49d8-a054-ece045552e27-kube-api-access-7vjqw\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.784000 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:39 crc kubenswrapper[4728]: I1216 15:08:39.966642 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk"] Dec 16 15:08:40 crc kubenswrapper[4728]: I1216 15:08:40.388619 4728 generic.go:334] "Generic (PLEG): container finished" podID="7fda98d5-5127-49d8-a054-ece045552e27" containerID="00fc069ea5923d48d1a94a0186c40edc7a13eb5ef6c8dbf7c234dc8d8ef3e70c" exitCode=0 Dec 16 15:08:40 crc kubenswrapper[4728]: I1216 15:08:40.388827 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" event={"ID":"7fda98d5-5127-49d8-a054-ece045552e27","Type":"ContainerDied","Data":"00fc069ea5923d48d1a94a0186c40edc7a13eb5ef6c8dbf7c234dc8d8ef3e70c"} Dec 16 15:08:40 crc kubenswrapper[4728]: I1216 15:08:40.388867 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" event={"ID":"7fda98d5-5127-49d8-a054-ece045552e27","Type":"ContainerStarted","Data":"d35b3e6e944d5bd4addf5eec3350b23aee91f294f91d1d5de6b79c79934ce3d9"} Dec 16 15:08:43 crc kubenswrapper[4728]: I1216 15:08:43.412766 4728 generic.go:334] "Generic (PLEG): container finished" podID="7fda98d5-5127-49d8-a054-ece045552e27" containerID="46e66fb6d380a940d4e437fce95b45f1fbc62f9194ed9a06baa3febeb4ef184e" exitCode=0 Dec 16 15:08:43 crc kubenswrapper[4728]: I1216 15:08:43.412869 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" event={"ID":"7fda98d5-5127-49d8-a054-ece045552e27","Type":"ContainerDied","Data":"46e66fb6d380a940d4e437fce95b45f1fbc62f9194ed9a06baa3febeb4ef184e"} Dec 16 15:08:44 crc kubenswrapper[4728]: I1216 15:08:44.430846 4728 generic.go:334] "Generic (PLEG): container finished" podID="7fda98d5-5127-49d8-a054-ece045552e27" containerID="2f4bbd8418b252ff46a7df1a88c6cfe2887f54028221c89111bc43d0f6153292" exitCode=0 Dec 16 15:08:44 crc kubenswrapper[4728]: I1216 
15:08:44.430994 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" event={"ID":"7fda98d5-5127-49d8-a054-ece045552e27","Type":"ContainerDied","Data":"2f4bbd8418b252ff46a7df1a88c6cfe2887f54028221c89111bc43d0f6153292"} Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.725032 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.908315 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-util\") pod \"7fda98d5-5127-49d8-a054-ece045552e27\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.908396 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vjqw\" (UniqueName: \"kubernetes.io/projected/7fda98d5-5127-49d8-a054-ece045552e27-kube-api-access-7vjqw\") pod \"7fda98d5-5127-49d8-a054-ece045552e27\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.908442 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-bundle\") pod \"7fda98d5-5127-49d8-a054-ece045552e27\" (UID: \"7fda98d5-5127-49d8-a054-ece045552e27\") " Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.910556 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-bundle" (OuterVolumeSpecName: "bundle") pod "7fda98d5-5127-49d8-a054-ece045552e27" (UID: "7fda98d5-5127-49d8-a054-ece045552e27"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.920623 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fda98d5-5127-49d8-a054-ece045552e27-kube-api-access-7vjqw" (OuterVolumeSpecName: "kube-api-access-7vjqw") pod "7fda98d5-5127-49d8-a054-ece045552e27" (UID: "7fda98d5-5127-49d8-a054-ece045552e27"). InnerVolumeSpecName "kube-api-access-7vjqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:08:45 crc kubenswrapper[4728]: I1216 15:08:45.921647 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-util" (OuterVolumeSpecName: "util") pod "7fda98d5-5127-49d8-a054-ece045552e27" (UID: "7fda98d5-5127-49d8-a054-ece045552e27"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:08:46 crc kubenswrapper[4728]: I1216 15:08:46.009807 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vjqw\" (UniqueName: \"kubernetes.io/projected/7fda98d5-5127-49d8-a054-ece045552e27-kube-api-access-7vjqw\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:46 crc kubenswrapper[4728]: I1216 15:08:46.009870 4728 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:46 crc kubenswrapper[4728]: I1216 15:08:46.009897 4728 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7fda98d5-5127-49d8-a054-ece045552e27-util\") on node \"crc\" DevicePath \"\"" Dec 16 15:08:46 crc kubenswrapper[4728]: I1216 15:08:46.453661 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" event={"ID":"7fda98d5-5127-49d8-a054-ece045552e27","Type":"ContainerDied","Data":"d35b3e6e944d5bd4addf5eec3350b23aee91f294f91d1d5de6b79c79934ce3d9"} Dec 16 15:08:46 crc kubenswrapper[4728]: I1216 15:08:46.454138 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d35b3e6e944d5bd4addf5eec3350b23aee91f294f91d1d5de6b79c79934ce3d9" Dec 16 15:08:46 crc kubenswrapper[4728]: I1216 15:08:46.453769 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.093796 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-fd5945654-clj75"] Dec 16 15:08:57 crc kubenswrapper[4728]: E1216 15:08:57.095398 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="pull" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.095501 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="pull" Dec 16 15:08:57 crc kubenswrapper[4728]: E1216 15:08:57.095564 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="util" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.095618 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="util" Dec 16 15:08:57 crc kubenswrapper[4728]: E1216 15:08:57.095672 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="extract" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.095726 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="extract" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.095871 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fda98d5-5127-49d8-a054-ece045552e27" containerName="extract" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.096277 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.098136 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.098467 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.098505 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.098773 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-p52sc" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.098844 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.136242 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fd5945654-clj75"] Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.240900 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2cmf\" (UniqueName: \"kubernetes.io/projected/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-kube-api-access-m2cmf\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.240996 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-apiservice-cert\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.241029 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-webhook-cert\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.313264 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx"] Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.314112 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.315880 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.316076 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.316094 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9twcz" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.336293 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx"] Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.342077 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-apiservice-cert\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.342132 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-apiservice-cert\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.342161 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-webhook-cert\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.342211 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-webhook-cert\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.342252 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2cmf\" (UniqueName: \"kubernetes.io/projected/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-kube-api-access-m2cmf\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.342278 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zpsx\" (UniqueName: \"kubernetes.io/projected/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-kube-api-access-2zpsx\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc 
kubenswrapper[4728]: I1216 15:08:57.347808 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-apiservice-cert\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.348798 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-webhook-cert\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.376375 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2cmf\" (UniqueName: \"kubernetes.io/projected/d1b1e578-b0a6-446b-90d1-7df5d4d4a43a-kube-api-access-m2cmf\") pod \"metallb-operator-controller-manager-fd5945654-clj75\" (UID: \"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a\") " pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.409229 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.443256 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-webhook-cert\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.443335 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zpsx\" (UniqueName: \"kubernetes.io/projected/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-kube-api-access-2zpsx\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.443399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-apiservice-cert\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.446659 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-apiservice-cert\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.446907 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-webhook-cert\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: 
\"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.471911 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zpsx\" (UniqueName: \"kubernetes.io/projected/55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9-kube-api-access-2zpsx\") pod \"metallb-operator-webhook-server-6dfbdf4c69-n5ksx\" (UID: \"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9\") " pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.626044 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.770610 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fd5945654-clj75"] Dec 16 15:08:57 crc kubenswrapper[4728]: I1216 15:08:57.965564 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx"] Dec 16 15:08:57 crc kubenswrapper[4728]: W1216 15:08:57.968920 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c8e87a_d9fe_4f1c_af42_0dee2e0f3fd9.slice/crio-5881fd642fda59adff6ba60b116f8e0e394e1cf2de90d7566fff662c8e6e77ba WatchSource:0}: Error finding container 5881fd642fda59adff6ba60b116f8e0e394e1cf2de90d7566fff662c8e6e77ba: Status 404 returned error can't find the container with id 5881fd642fda59adff6ba60b116f8e0e394e1cf2de90d7566fff662c8e6e77ba Dec 16 15:08:58 crc kubenswrapper[4728]: I1216 15:08:58.520610 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" event={"ID":"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a","Type":"ContainerStarted","Data":"9a9ea38482bae4a38782db6a6ca4126bc6bf960469b6bbcd1c1795106fceaad6"} Dec 16 15:08:58 crc kubenswrapper[4728]: I1216 15:08:58.522204 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" event={"ID":"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9","Type":"ContainerStarted","Data":"5881fd642fda59adff6ba60b116f8e0e394e1cf2de90d7566fff662c8e6e77ba"} Dec 16 15:09:04 crc kubenswrapper[4728]: I1216 15:09:04.560486 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" event={"ID":"55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9","Type":"ContainerStarted","Data":"18713b37f53f0289b7310e30c6c5645184d9f1d53baed738f37875ea3dfb4f92"} Dec 16 15:09:04 crc kubenswrapper[4728]: I1216 15:09:04.561218 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:09:04 crc kubenswrapper[4728]: I1216 15:09:04.562121 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" event={"ID":"d1b1e578-b0a6-446b-90d1-7df5d4d4a43a","Type":"ContainerStarted","Data":"f4a8e62a5a67f64ef92942deec691a1567482526d21c2feb96e9cf07b341da32"} Dec 16 15:09:04 crc kubenswrapper[4728]: I1216 15:09:04.562283 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:09:04 crc kubenswrapper[4728]: I1216 15:09:04.580576 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" podStartSLOduration=1.2118305440000001 podStartE2EDuration="7.580558278s" podCreationTimestamp="2025-12-16 15:08:57 +0000 UTC" firstStartedPulling="2025-12-16 15:08:57.973602971 +0000 UTC m=+718.813781955" lastFinishedPulling="2025-12-16 15:09:04.342330705 +0000 UTC m=+725.182509689" observedRunningTime="2025-12-16 15:09:04.577138754 +0000 UTC m=+725.417317748" watchObservedRunningTime="2025-12-16 15:09:04.580558278 +0000 UTC m=+725.420737262" Dec 16 15:09:04 crc kubenswrapper[4728]: I1216 15:09:04.603452 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" podStartSLOduration=1.063665087 podStartE2EDuration="7.603429049s" podCreationTimestamp="2025-12-16 15:08:57 +0000 UTC" firstStartedPulling="2025-12-16 15:08:57.785832851 +0000 UTC m=+718.626011835" lastFinishedPulling="2025-12-16 15:09:04.325596813 +0000 UTC m=+725.165775797" observedRunningTime="2025-12-16 15:09:04.598942185 +0000 UTC m=+725.439121189" watchObservedRunningTime="2025-12-16 15:09:04.603429049 +0000 UTC m=+725.443608033" Dec 16 15:09:17 crc kubenswrapper[4728]: I1216 15:09:17.631775 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6dfbdf4c69-n5ksx" Dec 16 15:09:36 crc kubenswrapper[4728]: I1216 15:09:36.069811 4728 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 16 15:09:37 crc kubenswrapper[4728]: I1216 15:09:37.413885 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-fd5945654-clj75" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.164690 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.165736 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.171794 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vwbfc"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.174138 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.175508 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.177038 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gr5zt" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.177320 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.179290 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.188943 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.247506 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-872z5"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.248312 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.250556 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.251254 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.251379 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jqd42" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.255566 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.289543 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-x5g2t"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.291084 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.294999 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-x5g2t"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.295281 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366325 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7rh\" (UniqueName: \"kubernetes.io/projected/bd55b5d2-c827-4b76-bd1e-e1c033737650-kube-api-access-4w7rh\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366367 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/129197cb-b920-4ccb-870a-b3b7aabc5928-metrics-certs\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366386 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd55b5d2-c827-4b76-bd1e-e1c033737650-metrics-certs\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366418 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa56798-790e-42c2-98af-9e0f7313603c-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-5w4gq\" (UID: \"afa56798-790e-42c2-98af-9e0f7313603c\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366443 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd55b5d2-c827-4b76-bd1e-e1c033737650-cert\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366537 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmczz\" (UniqueName: \"kubernetes.io/projected/0c9a8885-9664-4048-bce4-8fc1cab033d8-kube-api-access-jmczz\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366568 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0c9a8885-9664-4048-bce4-8fc1cab033d8-metallb-excludel2\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366601 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-conf\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " 
pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366615 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-metrics-certs\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366634 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-startup\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366648 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwhp\" (UniqueName: \"kubernetes.io/projected/129197cb-b920-4ccb-870a-b3b7aabc5928-kube-api-access-2vwhp\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366664 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-sockets\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366680 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-metrics\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366696 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-reloader\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366709 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbsh\" (UniqueName: \"kubernetes.io/projected/afa56798-790e-42c2-98af-9e0f7313603c-kube-api-access-plbsh\") pod \"frr-k8s-webhook-server-7784b6fcf-5w4gq\" (UID: \"afa56798-790e-42c2-98af-9e0f7313603c\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.366727 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.467463 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-metrics-certs\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468268 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-startup\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468306 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vwhp\" (UniqueName: \"kubernetes.io/projected/129197cb-b920-4ccb-870a-b3b7aabc5928-kube-api-access-2vwhp\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468326 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-sockets\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468340 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-metrics\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468374 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-reloader\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468391 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbsh\" (UniqueName: \"kubernetes.io/projected/afa56798-790e-42c2-98af-9e0f7313603c-kube-api-access-plbsh\") pod \"frr-k8s-webhook-server-7784b6fcf-5w4gq\" (UID: \"afa56798-790e-42c2-98af-9e0f7313603c\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468435 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468459 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7rh\" (UniqueName: \"kubernetes.io/projected/bd55b5d2-c827-4b76-bd1e-e1c033737650-kube-api-access-4w7rh\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468484 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/129197cb-b920-4ccb-870a-b3b7aabc5928-metrics-certs\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468517 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd55b5d2-c827-4b76-bd1e-e1c033737650-metrics-certs\") pod 
\"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468537 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa56798-790e-42c2-98af-9e0f7313603c-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-5w4gq\" (UID: \"afa56798-790e-42c2-98af-9e0f7313603c\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468590 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd55b5d2-c827-4b76-bd1e-e1c033737650-cert\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468613 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmczz\" (UniqueName: \"kubernetes.io/projected/0c9a8885-9664-4048-bce4-8fc1cab033d8-kube-api-access-jmczz\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468630 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0c9a8885-9664-4048-bce4-8fc1cab033d8-metallb-excludel2\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468669 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-conf\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.468995 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-conf\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.469575 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-startup\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.469643 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-reloader\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.469696 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-frr-sockets\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.469819 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/129197cb-b920-4ccb-870a-b3b7aabc5928-metrics\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: E1216 15:09:38.469964 4728 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 15:09:38 crc kubenswrapper[4728]: E1216 15:09:38.470108 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist podName:0c9a8885-9664-4048-bce4-8fc1cab033d8 nodeName:}" failed. No retries permitted until 2025-12-16 15:09:38.970084108 +0000 UTC m=+759.810263092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist") pod "speaker-872z5" (UID: "0c9a8885-9664-4048-bce4-8fc1cab033d8") : secret "metallb-memberlist" not found Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.470886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0c9a8885-9664-4048-bce4-8fc1cab033d8-metallb-excludel2\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.471514 4728 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.473963 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/129197cb-b920-4ccb-870a-b3b7aabc5928-metrics-certs\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.474902 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-metrics-certs\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.476601 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd55b5d2-c827-4b76-bd1e-e1c033737650-metrics-certs\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.476601 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa56798-790e-42c2-98af-9e0f7313603c-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-5w4gq\" (UID: \"afa56798-790e-42c2-98af-9e0f7313603c\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.484886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd55b5d2-c827-4b76-bd1e-e1c033737650-cert\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.486352 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbsh\" (UniqueName: 
\"kubernetes.io/projected/afa56798-790e-42c2-98af-9e0f7313603c-kube-api-access-plbsh\") pod \"frr-k8s-webhook-server-7784b6fcf-5w4gq\" (UID: \"afa56798-790e-42c2-98af-9e0f7313603c\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.487901 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vwhp\" (UniqueName: \"kubernetes.io/projected/129197cb-b920-4ccb-870a-b3b7aabc5928-kube-api-access-2vwhp\") pod \"frr-k8s-vwbfc\" (UID: \"129197cb-b920-4ccb-870a-b3b7aabc5928\") " pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.490885 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.496658 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmczz\" (UniqueName: \"kubernetes.io/projected/0c9a8885-9664-4048-bce4-8fc1cab033d8-kube-api-access-jmczz\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.500073 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7rh\" (UniqueName: \"kubernetes.io/projected/bd55b5d2-c827-4b76-bd1e-e1c033737650-kube-api-access-4w7rh\") pod \"controller-5bddd4b946-x5g2t\" (UID: \"bd55b5d2-c827-4b76-bd1e-e1c033737650\") " pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.509802 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.613314 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.713676 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq"] Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.778169 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" event={"ID":"afa56798-790e-42c2-98af-9e0f7313603c","Type":"ContainerStarted","Data":"8aea9b467cad7b870b8696a75ad427bd25eda2ba1bf2bccd37d779e8dc213aef"} Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.779582 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"916cc5e08da220ec51dd259d3fa59dfce83faaa61dff0a8a69ddd65bba19fb0e"} Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.890740 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-x5g2t"] Dec 16 15:09:38 crc kubenswrapper[4728]: W1216 15:09:38.895116 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd55b5d2_c827_4b76_bd1e_e1c033737650.slice/crio-c1148be0e5b54acdb71b981b3838842af1bda692f40f49a38968bb1d2fc3b5f9 WatchSource:0}: Error finding container c1148be0e5b54acdb71b981b3838842af1bda692f40f49a38968bb1d2fc3b5f9: Status 404 returned error can't find the container with id c1148be0e5b54acdb71b981b3838842af1bda692f40f49a38968bb1d2fc3b5f9 Dec 16 15:09:38 crc kubenswrapper[4728]: I1216 15:09:38.987091 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:38 crc kubenswrapper[4728]: E1216 15:09:38.987280 4728 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 16 15:09:38 crc kubenswrapper[4728]: E1216 15:09:38.987335 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist podName:0c9a8885-9664-4048-bce4-8fc1cab033d8 nodeName:}" failed. No retries permitted until 2025-12-16 15:09:39.98731644 +0000 UTC m=+760.827495424 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist") pod "speaker-872z5" (UID: "0c9a8885-9664-4048-bce4-8fc1cab033d8") : secret "metallb-memberlist" not found Dec 16 15:09:39 crc kubenswrapper[4728]: I1216 15:09:39.785809 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-x5g2t" event={"ID":"bd55b5d2-c827-4b76-bd1e-e1c033737650","Type":"ContainerStarted","Data":"95f9d2acd9bf6e14b5bb0aa73f174405ddba9a5f52ad7e9a6b9d93472128bcf9"} Dec 16 15:09:39 crc kubenswrapper[4728]: I1216 15:09:39.786048 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-x5g2t" event={"ID":"bd55b5d2-c827-4b76-bd1e-e1c033737650","Type":"ContainerStarted","Data":"fd415df2d02d6ba942af321657148e451c154b06f0c8fed0c1f031688648bde2"} Dec 16 15:09:39 crc kubenswrapper[4728]: I1216 15:09:39.786059 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-x5g2t" event={"ID":"bd55b5d2-c827-4b76-bd1e-e1c033737650","Type":"ContainerStarted","Data":"c1148be0e5b54acdb71b981b3838842af1bda692f40f49a38968bb1d2fc3b5f9"} Dec 16 15:09:39 crc kubenswrapper[4728]: I1216 15:09:39.786073 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:39 crc kubenswrapper[4728]: I1216 15:09:39.801790 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-x5g2t" podStartSLOduration=1.801776173 podStartE2EDuration="1.801776173s" podCreationTimestamp="2025-12-16 15:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:09:39.799000816 +0000 UTC m=+760.639179800" watchObservedRunningTime="2025-12-16 15:09:39.801776173 +0000 UTC m=+760.641955157" Dec 16 15:09:39 crc kubenswrapper[4728]: I1216 15:09:39.998797 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:40 crc kubenswrapper[4728]: I1216 15:09:40.004505 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0c9a8885-9664-4048-bce4-8fc1cab033d8-memberlist\") pod \"speaker-872z5\" (UID: \"0c9a8885-9664-4048-bce4-8fc1cab033d8\") " pod="metallb-system/speaker-872z5" Dec 16 15:09:40 crc kubenswrapper[4728]: I1216 15:09:40.064881 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-872z5" Dec 16 15:09:40 crc kubenswrapper[4728]: I1216 15:09:40.795332 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-872z5" event={"ID":"0c9a8885-9664-4048-bce4-8fc1cab033d8","Type":"ContainerStarted","Data":"1c3bcd5a6d37dfeaa39a666b731e1a21e1878f82da401daf2b64ff8608ab2d46"} Dec 16 15:09:40 crc kubenswrapper[4728]: I1216 15:09:40.795680 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-872z5" event={"ID":"0c9a8885-9664-4048-bce4-8fc1cab033d8","Type":"ContainerStarted","Data":"4d54927e9b0d2ef2fa74d5f60a562700b8fbda687e04f36886facf3da2c6e979"} Dec 16 15:09:40 crc kubenswrapper[4728]: I1216 15:09:40.795699 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-872z5" event={"ID":"0c9a8885-9664-4048-bce4-8fc1cab033d8","Type":"ContainerStarted","Data":"f46e24aece50cd4b83483f74dcd4e0178b821ee42c62359e82bb0a50182c2d97"} Dec 16 15:09:40 crc kubenswrapper[4728]: I1216 15:09:40.795923 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-872z5" Dec 16 15:09:45 crc kubenswrapper[4728]: I1216 15:09:45.830426 4728 generic.go:334] "Generic (PLEG): container finished" podID="129197cb-b920-4ccb-870a-b3b7aabc5928" containerID="e07952594374109372c6ca359c32571bf382976bae444708b8ffc134b31856b8" exitCode=0 Dec 16 15:09:45 crc kubenswrapper[4728]: I1216 15:09:45.830633 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerDied","Data":"e07952594374109372c6ca359c32571bf382976bae444708b8ffc134b31856b8"} Dec 16 15:09:45 crc kubenswrapper[4728]: I1216 15:09:45.834231 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" event={"ID":"afa56798-790e-42c2-98af-9e0f7313603c","Type":"ContainerStarted","Data":"f5bc257f4b7f13cc2adc9b3499acceb1a935397ee866efd86b55f3c64788fc0c"} Dec 16 15:09:45 crc kubenswrapper[4728]: I1216 15:09:45.834473 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:45 crc kubenswrapper[4728]: I1216 15:09:45.869259 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-872z5" podStartSLOduration=7.869231163 podStartE2EDuration="7.869231163s" podCreationTimestamp="2025-12-16 15:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:09:40.818604339 +0000 UTC m=+761.658783333" watchObservedRunningTime="2025-12-16 15:09:45.869231163 +0000 UTC m=+766.709410197" Dec 16 15:09:45 crc kubenswrapper[4728]: I1216 15:09:45.883931 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" podStartSLOduration=1.307139584 podStartE2EDuration="7.883913348s" podCreationTimestamp="2025-12-16 15:09:38 +0000 UTC" firstStartedPulling="2025-12-16 15:09:38.75537385 +0000 UTC m=+759.595552834" lastFinishedPulling="2025-12-16 15:09:45.332147574 +0000 UTC m=+766.172326598" observedRunningTime="2025-12-16 15:09:45.882788617 +0000 UTC m=+766.722967601" watchObservedRunningTime="2025-12-16 15:09:45.883913348 +0000 UTC m=+766.724092332" Dec 16 15:09:46 crc kubenswrapper[4728]: I1216 15:09:46.846597 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="129197cb-b920-4ccb-870a-b3b7aabc5928" containerID="b1db0518eb6965f2bcee6d29db6eb931f54d8fa0067d0d3df20204464c63a5a3" exitCode=0 Dec 16 15:09:46 crc kubenswrapper[4728]: I1216 15:09:46.846732 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerDied","Data":"b1db0518eb6965f2bcee6d29db6eb931f54d8fa0067d0d3df20204464c63a5a3"} Dec 16 15:09:47 crc kubenswrapper[4728]: I1216 15:09:47.859302 4728 generic.go:334] "Generic (PLEG): container finished" podID="129197cb-b920-4ccb-870a-b3b7aabc5928" containerID="4e8a02ea8b46d0708733ef74338cf6d0b1b9ecc50061c802fc87b1347feb247c" exitCode=0 Dec 16 15:09:47 crc kubenswrapper[4728]: I1216 15:09:47.859443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerDied","Data":"4e8a02ea8b46d0708733ef74338cf6d0b1b9ecc50061c802fc87b1347feb247c"} Dec 16 15:09:48 crc kubenswrapper[4728]: I1216 15:09:48.617256 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-x5g2t" Dec 16 15:09:48 crc kubenswrapper[4728]: I1216 15:09:48.868703 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"d14c52ec90a89a3673e86abe612a519bd36be3b46d3dc6e70efecfec0a662f8c"} Dec 16 15:09:48 crc kubenswrapper[4728]: I1216 15:09:48.868751 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"2b9f6cb9b2cb2088fdd7422142e1be7effa218bce06a2844b572c776ffa1d90f"} Dec 16 15:09:48 crc kubenswrapper[4728]: I1216 15:09:48.868763 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"2984bb3e367f5e2155ae3ca6b62d959734018f355d8056b3d24ae149bc855630"} Dec 16 15:09:48 crc kubenswrapper[4728]: I1216 15:09:48.868772 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"c706ab0968458b37fbe787f4e2706000a6f468f238fa4c837504b447db1098d5"} Dec 16 15:09:48 crc kubenswrapper[4728]: I1216 15:09:48.868780 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"5b026ab4228f56ba2d9d636eff9c1fff44dabe2c4a448f55ca4479c62b29ee37"} Dec 16 15:09:49 crc kubenswrapper[4728]: I1216 15:09:49.882319 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vwbfc" event={"ID":"129197cb-b920-4ccb-870a-b3b7aabc5928","Type":"ContainerStarted","Data":"6afd1587f32fd82b09e33e1869de571a173cee987d3952822ed1eb5e8fef071f"} Dec 16 15:09:49 crc kubenswrapper[4728]: I1216 15:09:49.882584 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:49 crc kubenswrapper[4728]: I1216 15:09:49.925649 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vwbfc" podStartSLOduration=5.261362788 podStartE2EDuration="11.925624035s" podCreationTimestamp="2025-12-16 15:09:38 +0000 UTC" firstStartedPulling="2025-12-16 15:09:38.645461648 +0000 UTC m=+759.485640632" 
lastFinishedPulling="2025-12-16 15:09:45.309722885 +0000 UTC m=+766.149901879" observedRunningTime="2025-12-16 15:09:49.914192609 +0000 UTC m=+770.754371643" watchObservedRunningTime="2025-12-16 15:09:49.925624035 +0000 UTC m=+770.765803059" Dec 16 15:09:50 crc kubenswrapper[4728]: I1216 15:09:50.070305 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-872z5" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.219978 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5w5dq"] Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.220889 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.224243 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-plr5l" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.235915 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.235932 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.243906 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5w5dq"] Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.300231 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8hgz\" (UniqueName: \"kubernetes.io/projected/61645953-0502-4766-aa0f-c1a7a97c9258-kube-api-access-s8hgz\") pod \"openstack-operator-index-5w5dq\" (UID: \"61645953-0502-4766-aa0f-c1a7a97c9258\") " pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.403822 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8hgz\" (UniqueName: \"kubernetes.io/projected/61645953-0502-4766-aa0f-c1a7a97c9258-kube-api-access-s8hgz\") pod \"openstack-operator-index-5w5dq\" (UID: \"61645953-0502-4766-aa0f-c1a7a97c9258\") " pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.425021 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8hgz\" (UniqueName: \"kubernetes.io/projected/61645953-0502-4766-aa0f-c1a7a97c9258-kube-api-access-s8hgz\") pod \"openstack-operator-index-5w5dq\" (UID: \"61645953-0502-4766-aa0f-c1a7a97c9258\") " pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.513605 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.560741 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.587815 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.804864 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5w5dq"] Dec 16 15:09:53 crc kubenswrapper[4728]: I1216 15:09:53.910578 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5w5dq" event={"ID":"61645953-0502-4766-aa0f-c1a7a97c9258","Type":"ContainerStarted","Data":"10aa85fc5b2fa64d77cad58c2ee00261188d41b6699cef8314270ea8000dfa0a"} Dec 16 15:09:57 crc kubenswrapper[4728]: I1216 15:09:57.572739 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5w5dq"] Dec 16 15:09:57 crc kubenswrapper[4728]: I1216 15:09:57.959820 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5w5dq" event={"ID":"61645953-0502-4766-aa0f-c1a7a97c9258","Type":"ContainerStarted","Data":"436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3"} Dec 16 15:09:57 crc kubenswrapper[4728]: I1216 15:09:57.959996 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5w5dq" podUID="61645953-0502-4766-aa0f-c1a7a97c9258" containerName="registry-server" containerID="cri-o://436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3" gracePeriod=2 Dec 16 15:09:57 crc kubenswrapper[4728]: I1216 15:09:57.981066 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5w5dq" podStartSLOduration=1.08909669 podStartE2EDuration="4.981042825s" podCreationTimestamp="2025-12-16 15:09:53 +0000 UTC" firstStartedPulling="2025-12-16 15:09:53.813607179 +0000 UTC m=+774.653786183" lastFinishedPulling="2025-12-16 15:09:57.705553324 +0000 UTC m=+778.545732318" observedRunningTime="2025-12-16 15:09:57.978191677 +0000 UTC m=+778.818370681" watchObservedRunningTime="2025-12-16 15:09:57.981042825 +0000 UTC m=+778.821221839" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.212617 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9hjx2"] Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.214124 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.229091 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9hjx2"] Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.375831 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj55g\" (UniqueName: \"kubernetes.io/projected/3818a60e-feb9-4ae0-a15a-48c59870b921-kube-api-access-nj55g\") pod \"openstack-operator-index-9hjx2\" (UID: \"3818a60e-feb9-4ae0-a15a-48c59870b921\") " pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.459480 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.477800 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj55g\" (UniqueName: \"kubernetes.io/projected/3818a60e-feb9-4ae0-a15a-48c59870b921-kube-api-access-nj55g\") pod \"openstack-operator-index-9hjx2\" (UID: \"3818a60e-feb9-4ae0-a15a-48c59870b921\") " pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.497772 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-5w4gq" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.504377 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj55g\" (UniqueName: \"kubernetes.io/projected/3818a60e-feb9-4ae0-a15a-48c59870b921-kube-api-access-nj55g\") pod \"openstack-operator-index-9hjx2\" (UID: \"3818a60e-feb9-4ae0-a15a-48c59870b921\") " pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.525399 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vwbfc" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.549938 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.579136 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8hgz\" (UniqueName: \"kubernetes.io/projected/61645953-0502-4766-aa0f-c1a7a97c9258-kube-api-access-s8hgz\") pod \"61645953-0502-4766-aa0f-c1a7a97c9258\" (UID: \"61645953-0502-4766-aa0f-c1a7a97c9258\") " Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.582811 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61645953-0502-4766-aa0f-c1a7a97c9258-kube-api-access-s8hgz" (OuterVolumeSpecName: "kube-api-access-s8hgz") pod "61645953-0502-4766-aa0f-c1a7a97c9258" (UID: "61645953-0502-4766-aa0f-c1a7a97c9258"). InnerVolumeSpecName "kube-api-access-s8hgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.680573 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8hgz\" (UniqueName: \"kubernetes.io/projected/61645953-0502-4766-aa0f-c1a7a97c9258-kube-api-access-s8hgz\") on node \"crc\" DevicePath \"\"" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.782953 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9hjx2"] Dec 16 15:09:58 crc kubenswrapper[4728]: W1216 15:09:58.791461 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3818a60e_feb9_4ae0_a15a_48c59870b921.slice/crio-d091e27eddb30b2ad7aba5ac6625f8d6b6989139eb7a1b9ed33691f9472e0985 WatchSource:0}: Error finding container d091e27eddb30b2ad7aba5ac6625f8d6b6989139eb7a1b9ed33691f9472e0985: Status 404 returned error can't find the container with id d091e27eddb30b2ad7aba5ac6625f8d6b6989139eb7a1b9ed33691f9472e0985 Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.975207 4728 generic.go:334] "Generic (PLEG): container finished" podID="61645953-0502-4766-aa0f-c1a7a97c9258" containerID="436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3" exitCode=0 Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.975373 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5w5dq" event={"ID":"61645953-0502-4766-aa0f-c1a7a97c9258","Type":"ContainerDied","Data":"436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3"} Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.976073 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5w5dq" event={"ID":"61645953-0502-4766-aa0f-c1a7a97c9258","Type":"ContainerDied","Data":"10aa85fc5b2fa64d77cad58c2ee00261188d41b6699cef8314270ea8000dfa0a"} Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.975429 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5w5dq" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.976132 4728 scope.go:117] "RemoveContainer" containerID="436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.978982 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9hjx2" event={"ID":"3818a60e-feb9-4ae0-a15a-48c59870b921","Type":"ContainerStarted","Data":"d091e27eddb30b2ad7aba5ac6625f8d6b6989139eb7a1b9ed33691f9472e0985"} Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.993250 4728 scope.go:117] "RemoveContainer" containerID="436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3" Dec 16 15:09:58 crc kubenswrapper[4728]: E1216 15:09:58.993945 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3\": container with ID starting with 436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3 not found: ID does not exist" containerID="436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3" Dec 16 15:09:58 crc kubenswrapper[4728]: I1216 15:09:58.994032 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3"} err="failed to get container status \"436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3\": rpc error: code = NotFound desc = could not find container \"436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3\": container with ID starting with 436f548a72a68f13568170a4bec41661067f81a7b30ad7b66a7478dae43dbfe3 not found: ID does not exist" Dec 16 15:09:59 crc kubenswrapper[4728]: I1216 15:09:59.026454 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5w5dq"] Dec 16 15:09:59 crc kubenswrapper[4728]: I1216 15:09:59.031156 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5w5dq"] Dec 16 15:09:59 crc kubenswrapper[4728]: I1216 15:09:59.527258 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61645953-0502-4766-aa0f-c1a7a97c9258" path="/var/lib/kubelet/pods/61645953-0502-4766-aa0f-c1a7a97c9258/volumes" Dec 16 15:09:59 crc kubenswrapper[4728]: I1216 15:09:59.987920 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9hjx2" event={"ID":"3818a60e-feb9-4ae0-a15a-48c59870b921","Type":"ContainerStarted","Data":"1ae14e72130d7bc2e01920bf192c1f430ad0d78cfaacef23b62ebfb58fc24c13"} Dec 16 15:10:00 crc kubenswrapper[4728]: I1216 15:10:00.000895 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9hjx2" podStartSLOduration=1.946746411 podStartE2EDuration="2.000877345s" podCreationTimestamp="2025-12-16 15:09:58 +0000 UTC" firstStartedPulling="2025-12-16 15:09:58.796263708 +0000 UTC m=+779.636442692" lastFinishedPulling="2025-12-16 15:09:58.850394642 +0000 UTC m=+779.690573626" observedRunningTime="2025-12-16 15:10:00.000230428 +0000 UTC m=+780.840409422" watchObservedRunningTime="2025-12-16 15:10:00.000877345 +0000 UTC m=+780.841056329" Dec 16 15:10:08 crc kubenswrapper[4728]: I1216 15:10:08.550074 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:10:08 crc kubenswrapper[4728]: I1216 15:10:08.550860 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:10:08 crc kubenswrapper[4728]: I1216 15:10:08.595392 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:10:08 crc kubenswrapper[4728]: I1216 15:10:08.819001 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:10:08 crc kubenswrapper[4728]: I1216 15:10:08.819086 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:10:09 crc kubenswrapper[4728]: I1216 15:10:09.102355 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9hjx2" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.036622 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp"] Dec 16 15:10:16 crc kubenswrapper[4728]: E1216 15:10:16.039631 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61645953-0502-4766-aa0f-c1a7a97c9258" containerName="registry-server" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.039645 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="61645953-0502-4766-aa0f-c1a7a97c9258" containerName="registry-server" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.039749 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="61645953-0502-4766-aa0f-c1a7a97c9258" containerName="registry-server" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.046504 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp"] Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.046620 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.054927 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-d4f98" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.072672 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-bundle\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.072725 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-util\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.072808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgp2\" (UniqueName: \"kubernetes.io/projected/6e466ee1-8f55-4662-932a-fb92d7d03f5e-kube-api-access-hmgp2\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.173619 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-bundle\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.173663 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-util\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.173702 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgp2\" (UniqueName: \"kubernetes.io/projected/6e466ee1-8f55-4662-932a-fb92d7d03f5e-kube-api-access-hmgp2\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.174328 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-bundle\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " 
pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.174730 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-util\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.199715 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgp2\" (UniqueName: \"kubernetes.io/projected/6e466ee1-8f55-4662-932a-fb92d7d03f5e-kube-api-access-hmgp2\") pod \"f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.364177 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:16 crc kubenswrapper[4728]: I1216 15:10:16.558914 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp"] Dec 16 15:10:17 crc kubenswrapper[4728]: I1216 15:10:17.134143 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerID="cc5a4c6ef50fc85b83b2d30726f97191bddaa4235ae26f9090c987c95564c224" exitCode=0 Dec 16 15:10:17 crc kubenswrapper[4728]: I1216 15:10:17.134236 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" event={"ID":"6e466ee1-8f55-4662-932a-fb92d7d03f5e","Type":"ContainerDied","Data":"cc5a4c6ef50fc85b83b2d30726f97191bddaa4235ae26f9090c987c95564c224"} Dec 16 15:10:17 crc kubenswrapper[4728]: I1216 15:10:17.134732 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" event={"ID":"6e466ee1-8f55-4662-932a-fb92d7d03f5e","Type":"ContainerStarted","Data":"db81b0535d592438049afdf2048443b1cb451ad96512d8f3a1b9148c60fb5665"} Dec 16 15:10:18 crc kubenswrapper[4728]: I1216 15:10:18.145141 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerID="7925b09d99851fcb6babb948d1a7ee904cbf28700c8ad0e3964ce250e9eabf00" exitCode=0 Dec 16 15:10:18 crc kubenswrapper[4728]: I1216 15:10:18.145214 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" event={"ID":"6e466ee1-8f55-4662-932a-fb92d7d03f5e","Type":"ContainerDied","Data":"7925b09d99851fcb6babb948d1a7ee904cbf28700c8ad0e3964ce250e9eabf00"} Dec 16 15:10:19 crc kubenswrapper[4728]: I1216 15:10:19.156063 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerID="a069d3951662685c54ef6bd79e05d036d65a31f30fec83f4a0f21f6ae6bc75f8" exitCode=0 Dec 16 15:10:19 crc kubenswrapper[4728]: I1216 15:10:19.156115 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" 
event={"ID":"6e466ee1-8f55-4662-932a-fb92d7d03f5e","Type":"ContainerDied","Data":"a069d3951662685c54ef6bd79e05d036d65a31f30fec83f4a0f21f6ae6bc75f8"} Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.530226 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.642308 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-util\") pod \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.642430 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-bundle\") pod \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.642478 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmgp2\" (UniqueName: \"kubernetes.io/projected/6e466ee1-8f55-4662-932a-fb92d7d03f5e-kube-api-access-hmgp2\") pod \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\" (UID: \"6e466ee1-8f55-4662-932a-fb92d7d03f5e\") " Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.643397 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-bundle" (OuterVolumeSpecName: "bundle") pod "6e466ee1-8f55-4662-932a-fb92d7d03f5e" (UID: "6e466ee1-8f55-4662-932a-fb92d7d03f5e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.643685 4728 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.650526 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e466ee1-8f55-4662-932a-fb92d7d03f5e-kube-api-access-hmgp2" (OuterVolumeSpecName: "kube-api-access-hmgp2") pod "6e466ee1-8f55-4662-932a-fb92d7d03f5e" (UID: "6e466ee1-8f55-4662-932a-fb92d7d03f5e"). InnerVolumeSpecName "kube-api-access-hmgp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.672868 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-util" (OuterVolumeSpecName: "util") pod "6e466ee1-8f55-4662-932a-fb92d7d03f5e" (UID: "6e466ee1-8f55-4662-932a-fb92d7d03f5e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.744726 4728 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e466ee1-8f55-4662-932a-fb92d7d03f5e-util\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:20 crc kubenswrapper[4728]: I1216 15:10:20.744796 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmgp2\" (UniqueName: \"kubernetes.io/projected/6e466ee1-8f55-4662-932a-fb92d7d03f5e-kube-api-access-hmgp2\") on node \"crc\" DevicePath \"\"" Dec 16 15:10:21 crc kubenswrapper[4728]: I1216 15:10:21.174916 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" event={"ID":"6e466ee1-8f55-4662-932a-fb92d7d03f5e","Type":"ContainerDied","Data":"db81b0535d592438049afdf2048443b1cb451ad96512d8f3a1b9148c60fb5665"} Dec 16 15:10:21 crc kubenswrapper[4728]: I1216 15:10:21.175310 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db81b0535d592438049afdf2048443b1cb451ad96512d8f3a1b9148c60fb5665" Dec 16 15:10:21 crc kubenswrapper[4728]: I1216 15:10:21.175221 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.066979 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd"] Dec 16 15:10:28 crc kubenswrapper[4728]: E1216 15:10:28.067881 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="extract" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.067897 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="extract" Dec 16 15:10:28 crc kubenswrapper[4728]: E1216 15:10:28.067925 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="util" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.067933 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="util" Dec 16 15:10:28 crc kubenswrapper[4728]: E1216 15:10:28.067953 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="pull" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.067963 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="pull" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.068096 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e466ee1-8f55-4662-932a-fb92d7d03f5e" containerName="extract" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.068650 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.070628 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-6vkdm" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.093561 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd"] Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.257296 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvfd\" (UniqueName: \"kubernetes.io/projected/d6a45f52-4776-491e-a850-afe8d2efa914-kube-api-access-grvfd\") pod \"openstack-operator-controller-operator-6bdf96f7b8-fqbkd\" (UID: \"d6a45f52-4776-491e-a850-afe8d2efa914\") " pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.357950 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grvfd\" (UniqueName: \"kubernetes.io/projected/d6a45f52-4776-491e-a850-afe8d2efa914-kube-api-access-grvfd\") pod \"openstack-operator-controller-operator-6bdf96f7b8-fqbkd\" (UID: \"d6a45f52-4776-491e-a850-afe8d2efa914\") " pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.381439 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvfd\" (UniqueName: \"kubernetes.io/projected/d6a45f52-4776-491e-a850-afe8d2efa914-kube-api-access-grvfd\") pod \"openstack-operator-controller-operator-6bdf96f7b8-fqbkd\" (UID: \"d6a45f52-4776-491e-a850-afe8d2efa914\") " pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.387363 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:28 crc kubenswrapper[4728]: I1216 15:10:28.628934 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd"] Dec 16 15:10:29 crc kubenswrapper[4728]: I1216 15:10:29.233386 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" event={"ID":"d6a45f52-4776-491e-a850-afe8d2efa914","Type":"ContainerStarted","Data":"8ba5a6ee07127c4d453202c974d8d97bde4f9a247f4fc60bb75c07a842e3c9c9"} Dec 16 15:10:34 crc kubenswrapper[4728]: I1216 15:10:34.267593 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" event={"ID":"d6a45f52-4776-491e-a850-afe8d2efa914","Type":"ContainerStarted","Data":"b04398262d595729b7613a22da67ee8a73a87ce1879e16560ca105bb5ec06bc5"} Dec 16 15:10:34 crc kubenswrapper[4728]: I1216 15:10:34.269048 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:34 crc kubenswrapper[4728]: I1216 15:10:34.295556 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" podStartSLOduration=1.923280444 podStartE2EDuration="6.295527991s" podCreationTimestamp="2025-12-16 15:10:28 +0000 UTC" firstStartedPulling="2025-12-16 15:10:28.641804377 +0000 UTC m=+809.481983361" lastFinishedPulling="2025-12-16 15:10:33.014051904 +0000 UTC m=+813.854230908" observedRunningTime="2025-12-16 15:10:34.292679843 +0000 UTC m=+815.132858847" watchObservedRunningTime="2025-12-16 15:10:34.295527991 +0000 UTC m=+815.135706965" Dec 16 15:10:38 crc kubenswrapper[4728]: I1216 15:10:38.391502 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6bdf96f7b8-fqbkd" Dec 16 15:10:38 crc kubenswrapper[4728]: I1216 15:10:38.818329 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:10:38 crc kubenswrapper[4728]: I1216 15:10:38.818735 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:11:08 crc kubenswrapper[4728]: I1216 15:11:08.818797 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:11:08 crc kubenswrapper[4728]: I1216 15:11:08.819331 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 16 15:11:08 crc kubenswrapper[4728]: I1216 15:11:08.819392 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:11:08 crc kubenswrapper[4728]: I1216 15:11:08.820193 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cc664ff3879b159126f992f52a6c4ccf1fc8c0903483566c983c5026f497d68"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:11:08 crc kubenswrapper[4728]: I1216 15:11:08.820273 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://0cc664ff3879b159126f992f52a6c4ccf1fc8c0903483566c983c5026f497d68" gracePeriod=600 Dec 16 15:11:10 crc kubenswrapper[4728]: I1216 15:11:10.522614 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="0cc664ff3879b159126f992f52a6c4ccf1fc8c0903483566c983c5026f497d68" exitCode=0 Dec 16 15:11:10 crc kubenswrapper[4728]: I1216 15:11:10.522799 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"0cc664ff3879b159126f992f52a6c4ccf1fc8c0903483566c983c5026f497d68"} Dec 16 15:11:10 crc kubenswrapper[4728]: I1216 15:11:10.523269 4728 scope.go:117] "RemoveContainer" containerID="5556e0d6dfe6e1666b1eb820e6992928174cc0e89be80318dfc33d104f059a37" Dec 16 15:11:10 crc kubenswrapper[4728]: I1216 15:11:10.523120 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"5f528a37171fd283501ab52158c0534c2dc70337f5ffb233b47cd1885a45c673"} Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.136923 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-l6vt8"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.140050 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.146281 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7t2gr" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.152402 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.153652 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.162352 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-d76rn" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.166197 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-l6vt8"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.170348 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.171285 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.175353 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.175922 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gdnkz" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.176062 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.201518 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-29zhw" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.202463 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.232849 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.279542 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.289986 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjs4v\" (UniqueName: \"kubernetes.io/projected/84531a1b-f019-449d-8779-05b03bde07cb-kube-api-access-hjs4v\") pod \"designate-operator-controller-manager-66f8b87655-qtz4v\" (UID: \"84531a1b-f019-449d-8779-05b03bde07cb\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.290035 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtfnn\" (UniqueName: \"kubernetes.io/projected/f5364dc6-650d-427d-aab6-c50ba3d69b75-kube-api-access-xtfnn\") pod \"glance-operator-controller-manager-767f9d7567-wn6qf\" (UID: \"f5364dc6-650d-427d-aab6-c50ba3d69b75\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.290076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2hkt\" (UniqueName: 
\"kubernetes.io/projected/dbf95255-3fe3-4421-be60-212514fef21c-kube-api-access-x2hkt\") pod \"cinder-operator-controller-manager-5f98b4754f-6sdq7\" (UID: \"dbf95255-3fe3-4421-be60-212514fef21c\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.290094 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wkv7\" (UniqueName: \"kubernetes.io/projected/12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c-kube-api-access-8wkv7\") pod \"barbican-operator-controller-manager-95949466-l6vt8\" (UID: \"12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.304220 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.313076 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.317834 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rw5pr" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.390123 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.392299 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjs4v\" (UniqueName: \"kubernetes.io/projected/84531a1b-f019-449d-8779-05b03bde07cb-kube-api-access-hjs4v\") pod \"designate-operator-controller-manager-66f8b87655-qtz4v\" (UID: \"84531a1b-f019-449d-8779-05b03bde07cb\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.392500 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtfnn\" (UniqueName: \"kubernetes.io/projected/f5364dc6-650d-427d-aab6-c50ba3d69b75-kube-api-access-xtfnn\") pod \"glance-operator-controller-manager-767f9d7567-wn6qf\" (UID: \"f5364dc6-650d-427d-aab6-c50ba3d69b75\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.392646 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2hkt\" (UniqueName: \"kubernetes.io/projected/dbf95255-3fe3-4421-be60-212514fef21c-kube-api-access-x2hkt\") pod \"cinder-operator-controller-manager-5f98b4754f-6sdq7\" (UID: \"dbf95255-3fe3-4421-be60-212514fef21c\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.392760 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wkv7\" (UniqueName: \"kubernetes.io/projected/12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c-kube-api-access-8wkv7\") pod \"barbican-operator-controller-manager-95949466-l6vt8\" (UID: \"12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.413873 4728 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.414775 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.431913 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-5xs88" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.439458 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.442856 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtfnn\" (UniqueName: \"kubernetes.io/projected/f5364dc6-650d-427d-aab6-c50ba3d69b75-kube-api-access-xtfnn\") pod \"glance-operator-controller-manager-767f9d7567-wn6qf\" (UID: \"f5364dc6-650d-427d-aab6-c50ba3d69b75\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.443766 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.444176 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2hkt\" (UniqueName: \"kubernetes.io/projected/dbf95255-3fe3-4421-be60-212514fef21c-kube-api-access-x2hkt\") pod \"cinder-operator-controller-manager-5f98b4754f-6sdq7\" (UID: \"dbf95255-3fe3-4421-be60-212514fef21c\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.444197 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjs4v\" (UniqueName: \"kubernetes.io/projected/84531a1b-f019-449d-8779-05b03bde07cb-kube-api-access-hjs4v\") pod \"designate-operator-controller-manager-66f8b87655-qtz4v\" (UID: \"84531a1b-f019-449d-8779-05b03bde07cb\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.444208 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wkv7\" (UniqueName: \"kubernetes.io/projected/12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c-kube-api-access-8wkv7\") pod \"barbican-operator-controller-manager-95949466-l6vt8\" (UID: \"12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.445776 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.452878 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.457728 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.458709 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.462831 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9r88b" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.463063 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.467555 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8qchf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.467732 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.468601 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.471206 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.482467 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.482886 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.483690 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.487267 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-nvrvr" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.487538 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bjb58" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.487966 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.488853 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.491726 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.492730 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.493743 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mltq7\" (UniqueName: \"kubernetes.io/projected/90e228b6-e35d-4ee2-992c-364b4abd8436-kube-api-access-mltq7\") pod \"heat-operator-controller-manager-59b8dcb766-qpsk9\" (UID: \"90e228b6-e35d-4ee2-992c-364b4abd8436\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.498470 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.501435 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rf24x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.501753 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nkr9z" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.503749 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.521088 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.526471 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.526500 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.528546 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.533012 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hr8tn" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.534765 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.544691 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.571477 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.572773 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.577943 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.581982 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.588547 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.589305 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.592843 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l4ss2" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.599739 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-s5rmf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600286 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7cld\" (UniqueName: \"kubernetes.io/projected/a8ceccb7-c74c-42c4-a763-d947892f942d-kube-api-access-h7cld\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600321 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmfs\" (UniqueName: \"kubernetes.io/projected/fe17017f-5157-4d72-bb40-58a456517c3e-kube-api-access-bmmfs\") pod \"neutron-operator-controller-manager-7cd87b778f-xmv9j\" (UID: \"fe17017f-5157-4d72-bb40-58a456517c3e\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600363 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgh4v\" (UniqueName: \"kubernetes.io/projected/0cc3d254-9633-4e63-91a8-719af70696f6-kube-api-access-cgh4v\") pod \"keystone-operator-controller-manager-5c7cbf548f-mns5x\" (UID: \"0cc3d254-9633-4e63-91a8-719af70696f6\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600386 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwff\" (UniqueName: \"kubernetes.io/projected/0252b186-dc46-4cca-ba92-9855cb2aa4ec-kube-api-access-njwff\") pod \"horizon-operator-controller-manager-6ccf486b9-hcdxf\" (UID: \"0252b186-dc46-4cca-ba92-9855cb2aa4ec\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600425 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k79rt\" (UniqueName: 
\"kubernetes.io/projected/660d7a4f-e56a-42c8-8db6-d1f7285d7d04-kube-api-access-k79rt\") pod \"ironic-operator-controller-manager-f458558d7-ttkv5\" (UID: \"660d7a4f-e56a-42c8-8db6-d1f7285d7d04\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600450 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rs9\" (UniqueName: \"kubernetes.io/projected/89d4ec07-baef-4061-b6d8-e50f3ab47bb1-kube-api-access-s9rs9\") pod \"mariadb-operator-controller-manager-f76f4954c-t6vdg\" (UID: \"89d4ec07-baef-4061-b6d8-e50f3ab47bb1\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600468 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcxcm\" (UniqueName: \"kubernetes.io/projected/59a84980-fdf4-4ff3-b8c7-464e1423bad3-kube-api-access-hcxcm\") pod \"manila-operator-controller-manager-5fdd9786f7-mfc2h\" (UID: \"59a84980-fdf4-4ff3-b8c7-464e1423bad3\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600509 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mltq7\" (UniqueName: \"kubernetes.io/projected/90e228b6-e35d-4ee2-992c-364b4abd8436-kube-api-access-mltq7\") pod \"heat-operator-controller-manager-59b8dcb766-qpsk9\" (UID: \"90e228b6-e35d-4ee2-992c-364b4abd8436\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.600530 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.623198 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.626124 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.631009 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.635194 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2nphf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.650728 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mltq7\" (UniqueName: \"kubernetes.io/projected/90e228b6-e35d-4ee2-992c-364b4abd8436-kube-api-access-mltq7\") pod \"heat-operator-controller-manager-59b8dcb766-qpsk9\" (UID: \"90e228b6-e35d-4ee2-992c-364b4abd8436\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.660007 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.691507 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702008 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgh4v\" (UniqueName: \"kubernetes.io/projected/0cc3d254-9633-4e63-91a8-719af70696f6-kube-api-access-cgh4v\") pod \"keystone-operator-controller-manager-5c7cbf548f-mns5x\" (UID: \"0cc3d254-9633-4e63-91a8-719af70696f6\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwff\" (UniqueName: \"kubernetes.io/projected/0252b186-dc46-4cca-ba92-9855cb2aa4ec-kube-api-access-njwff\") pod \"horizon-operator-controller-manager-6ccf486b9-hcdxf\" (UID: \"0252b186-dc46-4cca-ba92-9855cb2aa4ec\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702080 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46x4\" (UniqueName: \"kubernetes.io/projected/e501f8ed-3791-4661-8c3e-bfb4eaeeb64d-kube-api-access-t46x4\") pod \"octavia-operator-controller-manager-68c649d9d-j7jxc\" (UID: \"e501f8ed-3791-4661-8c3e-bfb4eaeeb64d\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702096 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k79rt\" (UniqueName: \"kubernetes.io/projected/660d7a4f-e56a-42c8-8db6-d1f7285d7d04-kube-api-access-k79rt\") pod \"ironic-operator-controller-manager-f458558d7-ttkv5\" (UID: \"660d7a4f-e56a-42c8-8db6-d1f7285d7d04\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702114 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8z8f\" (UniqueName: \"kubernetes.io/projected/6b5beb20-1139-4774-8ea6-b5c951a6cbba-kube-api-access-p8z8f\") pod \"nova-operator-controller-manager-5fbbf8b6cc-4jbw4\" (UID: \"6b5beb20-1139-4774-8ea6-b5c951a6cbba\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702132 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rs9\" (UniqueName: \"kubernetes.io/projected/89d4ec07-baef-4061-b6d8-e50f3ab47bb1-kube-api-access-s9rs9\") pod \"mariadb-operator-controller-manager-f76f4954c-t6vdg\" (UID: \"89d4ec07-baef-4061-b6d8-e50f3ab47bb1\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702149 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcxcm\" (UniqueName: \"kubernetes.io/projected/59a84980-fdf4-4ff3-b8c7-464e1423bad3-kube-api-access-hcxcm\") pod \"manila-operator-controller-manager-5fdd9786f7-mfc2h\" (UID: \"59a84980-fdf4-4ff3-b8c7-464e1423bad3\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702185 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702217 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7cld\" (UniqueName: \"kubernetes.io/projected/a8ceccb7-c74c-42c4-a763-d947892f942d-kube-api-access-h7cld\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702233 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702256 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mzg\" (UniqueName: \"kubernetes.io/projected/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-kube-api-access-n5mzg\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.702274 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmfs\" (UniqueName: \"kubernetes.io/projected/fe17017f-5157-4d72-bb40-58a456517c3e-kube-api-access-bmmfs\") pod \"neutron-operator-controller-manager-7cd87b778f-xmv9j\" (UID: \"fe17017f-5157-4d72-bb40-58a456517c3e\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:11:15 crc kubenswrapper[4728]: E1216 15:11:15.703049 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:15 crc kubenswrapper[4728]: E1216 15:11:15.703084 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert 
Dec 16 15:11:15 crc kubenswrapper[4728]: E1216 15:11:15.703084 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert podName:a8ceccb7-c74c-42c4-a763-d947892f942d nodeName:}" failed. No retries permitted until 2025-12-16 15:11:16.203070404 +0000 UTC m=+857.043249388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert") pod "infra-operator-controller-manager-84b495f78-ljkxp" (UID: "a8ceccb7-c74c-42c4-a763-d947892f942d") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.706676 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.720378 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.740507 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7cld\" (UniqueName: \"kubernetes.io/projected/a8ceccb7-c74c-42c4-a763-d947892f942d-kube-api-access-h7cld\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.749489 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.750299 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.752477 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k79rt\" (UniqueName: \"kubernetes.io/projected/660d7a4f-e56a-42c8-8db6-d1f7285d7d04-kube-api-access-k79rt\") pod \"ironic-operator-controller-manager-f458558d7-ttkv5\" (UID: \"660d7a4f-e56a-42c8-8db6-d1f7285d7d04\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.752858 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n952g" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.754464 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwff\" (UniqueName: \"kubernetes.io/projected/0252b186-dc46-4cca-ba92-9855cb2aa4ec-kube-api-access-njwff\") pod \"horizon-operator-controller-manager-6ccf486b9-hcdxf\" (UID: \"0252b186-dc46-4cca-ba92-9855cb2aa4ec\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.755303 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcxcm\" (UniqueName: \"kubernetes.io/projected/59a84980-fdf4-4ff3-b8c7-464e1423bad3-kube-api-access-hcxcm\") pod \"manila-operator-controller-manager-5fdd9786f7-mfc2h\" (UID: \"59a84980-fdf4-4ff3-b8c7-464e1423bad3\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.759084 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rs9\" (UniqueName: 
\"kubernetes.io/projected/89d4ec07-baef-4061-b6d8-e50f3ab47bb1-kube-api-access-s9rs9\") pod \"mariadb-operator-controller-manager-f76f4954c-t6vdg\" (UID: \"89d4ec07-baef-4061-b6d8-e50f3ab47bb1\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.760651 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmfs\" (UniqueName: \"kubernetes.io/projected/fe17017f-5157-4d72-bb40-58a456517c3e-kube-api-access-bmmfs\") pod \"neutron-operator-controller-manager-7cd87b778f-xmv9j\" (UID: \"fe17017f-5157-4d72-bb40-58a456517c3e\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.761456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgh4v\" (UniqueName: \"kubernetes.io/projected/0cc3d254-9633-4e63-91a8-719af70696f6-kube-api-access-cgh4v\") pod \"keystone-operator-controller-manager-5c7cbf548f-mns5x\" (UID: \"0cc3d254-9633-4e63-91a8-719af70696f6\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.802754 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.807875 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mzg\" (UniqueName: \"kubernetes.io/projected/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-kube-api-access-n5mzg\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.808044 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46x4\" (UniqueName: \"kubernetes.io/projected/e501f8ed-3791-4661-8c3e-bfb4eaeeb64d-kube-api-access-t46x4\") pod \"octavia-operator-controller-manager-68c649d9d-j7jxc\" (UID: \"e501f8ed-3791-4661-8c3e-bfb4eaeeb64d\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.808089 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8z8f\" (UniqueName: \"kubernetes.io/projected/6b5beb20-1139-4774-8ea6-b5c951a6cbba-kube-api-access-p8z8f\") pod \"nova-operator-controller-manager-5fbbf8b6cc-4jbw4\" (UID: \"6b5beb20-1139-4774-8ea6-b5c951a6cbba\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.808149 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fkx\" (UniqueName: \"kubernetes.io/projected/7a8c4b97-2de8-4235-aa76-c8382c5c5cb1-kube-api-access-46fkx\") pod \"ovn-operator-controller-manager-bf6d4f946-68vvq\" (UID: \"7a8c4b97-2de8-4235-aa76-c8382c5c5cb1\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.808256 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod 
\"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:15 crc kubenswrapper[4728]: E1216 15:11:15.808956 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:15 crc kubenswrapper[4728]: E1216 15:11:15.809174 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert podName:a4b04d21-7de1-4565-99e6-fbeb59a0fde6 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:16.309006588 +0000 UTC m=+857.149185572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" (UID: "a4b04d21-7de1-4565-99e6-fbeb59a0fde6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.817059 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.820063 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.826392 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2295x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.830013 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.832033 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8z8f\" (UniqueName: \"kubernetes.io/projected/6b5beb20-1139-4774-8ea6-b5c951a6cbba-kube-api-access-p8z8f\") pod \"nova-operator-controller-manager-5fbbf8b6cc-4jbw4\" (UID: \"6b5beb20-1139-4774-8ea6-b5c951a6cbba\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.832259 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mzg\" (UniqueName: \"kubernetes.io/projected/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-kube-api-access-n5mzg\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.837513 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.846564 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.855416 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.856248 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.863852 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-sc5hb" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.867954 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.868778 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46x4\" (UniqueName: \"kubernetes.io/projected/e501f8ed-3791-4661-8c3e-bfb4eaeeb64d-kube-api-access-t46x4\") pod \"octavia-operator-controller-manager-68c649d9d-j7jxc\" (UID: \"e501f8ed-3791-4661-8c3e-bfb4eaeeb64d\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.881455 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.885965 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.890755 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.891560 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-95txm" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.896106 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.901151 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.909779 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.910761 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862xp\" (UniqueName: \"kubernetes.io/projected/75c9a0f4-94bc-4bf5-b164-149256d1a214-kube-api-access-862xp\") pod \"placement-operator-controller-manager-8665b56d78-zcz8p\" (UID: \"75c9a0f4-94bc-4bf5-b164-149256d1a214\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.910824 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4-kube-api-access-7lgmx\") pod \"swift-operator-controller-manager-5c6df8f9-xvvjw\" (UID: \"9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.910855 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fkx\" (UniqueName: \"kubernetes.io/projected/7a8c4b97-2de8-4235-aa76-c8382c5c5cb1-kube-api-access-46fkx\") pod \"ovn-operator-controller-manager-bf6d4f946-68vvq\" (UID: \"7a8c4b97-2de8-4235-aa76-c8382c5c5cb1\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.922752 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.923843 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.927793 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.932821 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wkhbz" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.939495 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fkx\" (UniqueName: \"kubernetes.io/projected/7a8c4b97-2de8-4235-aa76-c8382c5c5cb1-kube-api-access-46fkx\") pod \"ovn-operator-controller-manager-bf6d4f946-68vvq\" (UID: \"7a8c4b97-2de8-4235-aa76-c8382c5c5cb1\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.943163 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.958574 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8"] Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.967362 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.986633 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" Dec 16 15:11:15 crc kubenswrapper[4728]: I1216 15:11:15.990696 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.004137 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.007550 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.009045 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.009735 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.011140 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vn5zk" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.011720 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-862xp\" (UniqueName: \"kubernetes.io/projected/75c9a0f4-94bc-4bf5-b164-149256d1a214-kube-api-access-862xp\") pod \"placement-operator-controller-manager-8665b56d78-zcz8p\" (UID: \"75c9a0f4-94bc-4bf5-b164-149256d1a214\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.011773 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4-kube-api-access-7lgmx\") pod \"swift-operator-controller-manager-5c6df8f9-xvvjw\" (UID: \"9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.011828 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjs4\" (UniqueName: \"kubernetes.io/projected/66c25f6d-85c4-4e3e-bf44-93499cc2321c-kube-api-access-ttjs4\") pod \"watcher-operator-controller-manager-55f78b7c4c-dz6x8\" (UID: \"66c25f6d-85c4-4e3e-bf44-93499cc2321c\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.011889 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm69f\" (UniqueName: \"kubernetes.io/projected/f155db6c-255a-4401-884a-b48825bb93c7-kube-api-access-pm69f\") pod \"telemetry-operator-controller-manager-97d456b9-n6x46\" (UID: \"f155db6c-255a-4401-884a-b48825bb93c7\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.011948 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrp7r\" (UniqueName: \"kubernetes.io/projected/8324ae5e-23f8-4267-9822-a4ae37c7cd5a-kube-api-access-rrp7r\") pod \"test-operator-controller-manager-756ccf86c7-9p2mz\" (UID: \"8324ae5e-23f8-4267-9822-a4ae37c7cd5a\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.047269 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgmx\" (UniqueName: \"kubernetes.io/projected/9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4-kube-api-access-7lgmx\") pod \"swift-operator-controller-manager-5c6df8f9-xvvjw\" (UID: 
\"9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.057036 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.058334 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.060740 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.062190 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-chblh" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.063266 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862xp\" (UniqueName: \"kubernetes.io/projected/75c9a0f4-94bc-4bf5-b164-149256d1a214-kube-api-access-862xp\") pod \"placement-operator-controller-manager-8665b56d78-zcz8p\" (UID: \"75c9a0f4-94bc-4bf5-b164-149256d1a214\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.112673 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.112724 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrp7r\" (UniqueName: \"kubernetes.io/projected/8324ae5e-23f8-4267-9822-a4ae37c7cd5a-kube-api-access-rrp7r\") pod \"test-operator-controller-manager-756ccf86c7-9p2mz\" (UID: \"8324ae5e-23f8-4267-9822-a4ae37c7cd5a\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.112780 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjs4\" (UniqueName: \"kubernetes.io/projected/66c25f6d-85c4-4e3e-bf44-93499cc2321c-kube-api-access-ttjs4\") pod \"watcher-operator-controller-manager-55f78b7c4c-dz6x8\" (UID: \"66c25f6d-85c4-4e3e-bf44-93499cc2321c\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.112828 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5hx\" (UniqueName: \"kubernetes.io/projected/0def48bf-646d-4641-93b5-a9e4e058cc67-kube-api-access-ll5hx\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.112853 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm69f\" (UniqueName: \"kubernetes.io/projected/f155db6c-255a-4401-884a-b48825bb93c7-kube-api-access-pm69f\") pod 
\"telemetry-operator-controller-manager-97d456b9-n6x46\" (UID: \"f155db6c-255a-4401-884a-b48825bb93c7\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.112890 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.113919 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.129956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjs4\" (UniqueName: \"kubernetes.io/projected/66c25f6d-85c4-4e3e-bf44-93499cc2321c-kube-api-access-ttjs4\") pod \"watcher-operator-controller-manager-55f78b7c4c-dz6x8\" (UID: \"66c25f6d-85c4-4e3e-bf44-93499cc2321c\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.134208 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm69f\" (UniqueName: \"kubernetes.io/projected/f155db6c-255a-4401-884a-b48825bb93c7-kube-api-access-pm69f\") pod \"telemetry-operator-controller-manager-97d456b9-n6x46\" (UID: \"f155db6c-255a-4401-884a-b48825bb93c7\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.138818 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrp7r\" (UniqueName: \"kubernetes.io/projected/8324ae5e-23f8-4267-9822-a4ae37c7cd5a-kube-api-access-rrp7r\") pod \"test-operator-controller-manager-756ccf86c7-9p2mz\" (UID: \"8324ae5e-23f8-4267-9822-a4ae37c7cd5a\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.170081 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.205620 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.208416 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-l6vt8"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.214274 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.214324 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4wb\" (UniqueName: \"kubernetes.io/projected/160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e-kube-api-access-gq4wb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-f9pgp\" (UID: \"160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.214367 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.214456 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.214475 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5hx\" (UniqueName: \"kubernetes.io/projected/0def48bf-646d-4641-93b5-a9e4e058cc67-kube-api-access-ll5hx\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.214897 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.214913 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.214964 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:16.714936464 +0000 UTC m=+857.555115448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.214980 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:16.714973605 +0000 UTC m=+857.555152589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "metrics-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.215035 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.215080 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert podName:a8ceccb7-c74c-42c4-a763-d947892f942d nodeName:}" failed. No retries permitted until 2025-12-16 15:11:17.215065107 +0000 UTC m=+858.055244091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert") pod "infra-operator-controller-manager-84b495f78-ljkxp" (UID: "a8ceccb7-c74c-42c4-a763-d947892f942d") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.229783 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.240583 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5hx\" (UniqueName: \"kubernetes.io/projected/0def48bf-646d-4641-93b5-a9e4e058cc67-kube-api-access-ll5hx\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.282799 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12db8b96_5f9f_4d46_9dbe_71b1e1d5c82c.slice/crio-2db4544ab3bb08bc375c5669c020e09d43118bcddf0a0c382e198160ecace725 WatchSource:0}: Error finding container 2db4544ab3bb08bc375c5669c020e09d43118bcddf0a0c382e198160ecace725: Status 404 returned error can't find the container with id 2db4544ab3bb08bc375c5669c020e09d43118bcddf0a0c382e198160ecace725 Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.283060 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.316339 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.316430 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4wb\" (UniqueName: \"kubernetes.io/projected/160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e-kube-api-access-gq4wb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-f9pgp\" (UID: \"160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.316948 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.316998 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert podName:a4b04d21-7de1-4565-99e6-fbeb59a0fde6 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:17.3169843 +0000 UTC m=+858.157163284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" (UID: "a4b04d21-7de1-4565-99e6-fbeb59a0fde6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.340156 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4wb\" (UniqueName: \"kubernetes.io/projected/160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e-kube-api-access-gq4wb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-f9pgp\" (UID: \"160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.399694 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.587932 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" event={"ID":"12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c","Type":"ContainerStarted","Data":"2db4544ab3bb08bc375c5669c020e09d43118bcddf0a0c382e198160ecace725"} Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.619558 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.627301 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7"] Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.630608 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf95255_3fe3_4421_be60_212514fef21c.slice/crio-b8e6c48beaaaf4696ea5118431d5a4cf9ec25727d94419fb69de464cb6c8423f WatchSource:0}: Error finding container b8e6c48beaaaf4696ea5118431d5a4cf9ec25727d94419fb69de464cb6c8423f: Status 404 returned error can't find the container with id b8e6c48beaaaf4696ea5118431d5a4cf9ec25727d94419fb69de464cb6c8423f Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.637067 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e228b6_e35d_4ee2_992c_364b4abd8436.slice/crio-acb447872e884d9c68bd0092b376acb7ba93f73bf863142823e2c8dae52f2be4 WatchSource:0}: Error finding container acb447872e884d9c68bd0092b376acb7ba93f73bf863142823e2c8dae52f2be4: Status 404 returned error can't find the container with id acb447872e884d9c68bd0092b376acb7ba93f73bf863142823e2c8dae52f2be4 Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.724655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.724745 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.724876 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.724962 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:17.724940172 +0000 UTC m=+858.565119166 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "webhook-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.724971 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: E1216 15:11:16.725049 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:17.725031205 +0000 UTC m=+858.565210179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "metrics-server-cert" not found Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.789279 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf"] Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.792732 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660d7a4f_e56a_42c8_8db6_d1f7285d7d04.slice/crio-8d79536d355663de5c43f52c1209bccc8055900b090cd445bc3182da93a9cc3a WatchSource:0}: Error finding container 8d79536d355663de5c43f52c1209bccc8055900b090cd445bc3182da93a9cc3a: Status 404 returned error can't find the container with id 8d79536d355663de5c43f52c1209bccc8055900b090cd445bc3182da93a9cc3a Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.797857 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0252b186_dc46_4cca_ba92_9855cb2aa4ec.slice/crio-98a56e565cf4f72205a97b1791b1a7fdc8cb362bcb5b5be1c7236ca2d0a83ad9 WatchSource:0}: Error finding container 98a56e565cf4f72205a97b1791b1a7fdc8cb362bcb5b5be1c7236ca2d0a83ad9: Status 404 returned error can't find the container with id 98a56e565cf4f72205a97b1791b1a7fdc8cb362bcb5b5be1c7236ca2d0a83ad9 Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.801944 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5"] Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.805865 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84531a1b_f019_449d_8779_05b03bde07cb.slice/crio-5b9d43f64c8245bc08239624e8a283833c8266236b398e627d7ee1e0ee4ced68 WatchSource:0}: Error finding container 5b9d43f64c8245bc08239624e8a283833c8266236b398e627d7ee1e0ee4ced68: Status 404 returned error can't find the container with id 5b9d43f64c8245bc08239624e8a283833c8266236b398e627d7ee1e0ee4ced68 Dec 16 15:11:16 crc kubenswrapper[4728]: W1216 15:11:16.811620 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5364dc6_650d_427d_aab6_c50ba3d69b75.slice/crio-57772daa4e859507d9115e1c3bcceb437d2acc1e0f07263cc9ed26462add5a20 WatchSource:0}: Error finding container 
57772daa4e859507d9115e1c3bcceb437d2acc1e0f07263cc9ed26462add5a20: Status 404 returned error can't find the container with id 57772daa4e859507d9115e1c3bcceb437d2acc1e0f07263cc9ed26462add5a20 Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.814141 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.820674 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf"] Dec 16 15:11:16 crc kubenswrapper[4728]: I1216 15:11:16.825634 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.010553 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.019475 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.039895 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x"] Dec 16 15:11:17 crc kubenswrapper[4728]: W1216 15:11:17.053485 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d4ec07_baef_4061_b6d8_e50f3ab47bb1.slice/crio-a73481c4edd2ca2b008174284bb3a0a7f3762f9464aec9b47a219e968606b990 WatchSource:0}: Error finding container a73481c4edd2ca2b008174284bb3a0a7f3762f9464aec9b47a219e968606b990: Status 404 returned error can't find the container with id a73481c4edd2ca2b008174284bb3a0a7f3762f9464aec9b47a219e968606b990 Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.059450 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq"] Dec 16 15:11:17 crc kubenswrapper[4728]: W1216 15:11:17.059929 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5beb20_1139_4774_8ea6_b5c951a6cbba.slice/crio-2c5cd7bd9fea0fe82d5a0259b4d2ba71eee02151f2fe354b32a5d25d780f41c0 WatchSource:0}: Error finding container 2c5cd7bd9fea0fe82d5a0259b4d2ba71eee02151f2fe354b32a5d25d780f41c0: Status 404 returned error can't find the container with id 2c5cd7bd9fea0fe82d5a0259b4d2ba71eee02151f2fe354b32a5d25d780f41c0 Dec 16 15:11:17 crc kubenswrapper[4728]: W1216 15:11:17.061246 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe17017f_5157_4d72_bb40_58a456517c3e.slice/crio-1a79bb6db30e889bf0c239db919288d479665568344545179bc13052e4831888 WatchSource:0}: Error finding container 1a79bb6db30e889bf0c239db919288d479665568344545179bc13052e4831888: Status 404 returned error can't find the container with id 1a79bb6db30e889bf0c239db919288d479665568344545179bc13052e4831888 Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.063550 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p8z8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-4jbw4_openstack-operators(6b5beb20-1139-4774-8ea6-b5c951a6cbba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.064145 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bmmfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-xmv9j_openstack-operators(fe17017f-5157-4d72-bb40-58a456517c3e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.064898 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" podUID="6b5beb20-1139-4774-8ea6-b5c951a6cbba" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.066007 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" podUID="fe17017f-5157-4d72-bb40-58a456517c3e" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.089848 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.094277 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.098926 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.208456 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz"] Dec 16 15:11:17 crc kubenswrapper[4728]: W1216 15:11:17.215822 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8324ae5e_23f8_4267_9822_a4ae37c7cd5a.slice/crio-28d1e886c0ede6a2c29c9a450566de33ff23e4e38ece4b3cf38edd13c420ee5a WatchSource:0}: Error finding container 28d1e886c0ede6a2c29c9a450566de33ff23e4e38ece4b3cf38edd13c420ee5a: Status 404 returned error can't find the container with id 28d1e886c0ede6a2c29c9a450566de33ff23e4e38ece4b3cf38edd13c420ee5a
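
Note: the Status 404 watch-event warnings above are a benign race (cAdvisor sees the new cgroup before CRI-O finishes registering the container), while the ErrImagePull entries are the kubelet's own rate limit on registry pulls: with this many operator images requested in the same second, the pull token bucket empties and further StartContainer attempts fail fast with "pull QPS exceeded" until the pods are retried. The limit is governed by the kubelet's registryPullQPS and registryBurst settings; 5 and 10 are, as far as I know, the stock defaults. A sketch of the same token-bucket gating using the client-go flowcontrol limiter; the image names are placeholders:

    package main

    import (
        "fmt"

        "k8s.io/client-go/util/flowcontrol"
    )

    func main() {
        // qps=5, burst=10: assumed kubelet defaults for registry pulls
        limiter := flowcontrol.NewTokenBucketRateLimiter(5.0, 10)

        for i := 0; i < 15; i++ {
            img := fmt.Sprintf("quay.io/example/operator-%d", i) // hypothetical refs
            if !limiter.TryAccept() {
                // the condition the kubelet surfaces above as ErrImagePull
                fmt.Println("pull QPS exceeded:", img)
                continue
            }
            fmt.Println("pulling:", img)
        }
    }

On a single-node setup such as CRC, raising registryPullQPS or pre-pulling the operator images is the usual way to avoid this burst; the pods affected here simply retry the pull on a later sync.
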
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rrp7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-9p2mz_openstack-operators(8324ae5e-23f8-4267-9822-a4ae37c7cd5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.219839 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" podUID="8324ae5e-23f8-4267-9822-a4ae37c7cd5a" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.233271 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.236569 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.236740 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.236792 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert podName:a8ceccb7-c74c-42c4-a763-d947892f942d nodeName:}" failed. No retries permitted until 2025-12-16 15:11:19.236775452 +0000 UTC m=+860.076954436 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert") pod "infra-operator-controller-manager-84b495f78-ljkxp" (UID: "a8ceccb7-c74c-42c4-a763-d947892f942d") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.241014 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46"] Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.277978 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw"] Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.280433 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pm69f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-97d456b9-n6x46_openstack-operators(f155db6c-255a-4401-884a-b48825bb93c7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.281574 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" podUID="f155db6c-255a-4401-884a-b48825bb93c7" Dec 16 15:11:17 crc kubenswrapper[4728]: W1216 15:11:17.287088 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca3d3f7_7e18_4c96_9071_1cd82d2b2ee4.slice/crio-016923af35c12ddbbecce4988ec411298099c764fc6f4a360bd73298d7294d83 WatchSource:0}: Error finding container 016923af35c12ddbbecce4988ec411298099c764fc6f4a360bd73298d7294d83: Status 404 returned error can't find the container with id 016923af35c12ddbbecce4988ec411298099c764fc6f4a360bd73298d7294d83 Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.290873 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp"] Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.292605 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7lgmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-xvvjw_openstack-operators(9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.294044 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" podUID="9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4" Dec 16 15:11:17 crc kubenswrapper[4728]: W1216 15:11:17.305846 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod160c8222_a7a2_4f58_bbe1_c6a5d4b6b38e.slice/crio-9a5a93ebacdc69dbe10312c14963ebff9f90f8bcfb7381f5398a5fc743ce0ab9 WatchSource:0}: Error finding container 9a5a93ebacdc69dbe10312c14963ebff9f90f8bcfb7381f5398a5fc743ce0ab9: Status 404 returned error can't find the container with id 9a5a93ebacdc69dbe10312c14963ebff9f90f8bcfb7381f5398a5fc743ce0ab9 Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.308271 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gq4wb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-f9pgp_openstack-operators(160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.309631 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" podUID="160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.337546 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.337776 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.337882 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert podName:a4b04d21-7de1-4565-99e6-fbeb59a0fde6 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:19.337865282 +0000 UTC m=+860.178044266 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" (UID: "a4b04d21-7de1-4565-99e6-fbeb59a0fde6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.595558 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" event={"ID":"66c25f6d-85c4-4e3e-bf44-93499cc2321c","Type":"ContainerStarted","Data":"041a45d38ca1b81088cabbe94c11a5a98ab825981908429c911096fbed7c3873"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.597043 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" event={"ID":"6b5beb20-1139-4774-8ea6-b5c951a6cbba","Type":"ContainerStarted","Data":"2c5cd7bd9fea0fe82d5a0259b4d2ba71eee02151f2fe354b32a5d25d780f41c0"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.598724 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" event={"ID":"660d7a4f-e56a-42c8-8db6-d1f7285d7d04","Type":"ContainerStarted","Data":"8d79536d355663de5c43f52c1209bccc8055900b090cd445bc3182da93a9cc3a"} Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.599024 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" podUID="6b5beb20-1139-4774-8ea6-b5c951a6cbba" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.601075 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" event={"ID":"89d4ec07-baef-4061-b6d8-e50f3ab47bb1","Type":"ContainerStarted","Data":"a73481c4edd2ca2b008174284bb3a0a7f3762f9464aec9b47a219e968606b990"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.602744 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" event={"ID":"90e228b6-e35d-4ee2-992c-364b4abd8436","Type":"ContainerStarted","Data":"acb447872e884d9c68bd0092b376acb7ba93f73bf863142823e2c8dae52f2be4"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.604416 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" event={"ID":"0252b186-dc46-4cca-ba92-9855cb2aa4ec","Type":"ContainerStarted","Data":"98a56e565cf4f72205a97b1791b1a7fdc8cb362bcb5b5be1c7236ca2d0a83ad9"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.605506 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" event={"ID":"75c9a0f4-94bc-4bf5-b164-149256d1a214","Type":"ContainerStarted","Data":"cf67b0d6aff067f01784ac452671b10328724228c703d3473b4adf38dc9a12d4"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.606246 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" 
event={"ID":"7a8c4b97-2de8-4235-aa76-c8382c5c5cb1","Type":"ContainerStarted","Data":"fe84d983a73c7c4e30e82663ca56be4cd30c97c67dd9280dabce7bdd161b1d93"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.607434 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" event={"ID":"160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e","Type":"ContainerStarted","Data":"9a5a93ebacdc69dbe10312c14963ebff9f90f8bcfb7381f5398a5fc743ce0ab9"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.608853 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" event={"ID":"59a84980-fdf4-4ff3-b8c7-464e1423bad3","Type":"ContainerStarted","Data":"80b947e7a09c9dc8a9c37a7ea841f34b30a3b01d34f9467a644d75480bf06f0b"} Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.609083 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" podUID="160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.612054 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" event={"ID":"dbf95255-3fe3-4421-be60-212514fef21c","Type":"ContainerStarted","Data":"b8e6c48beaaaf4696ea5118431d5a4cf9ec25727d94419fb69de464cb6c8423f"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.614305 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" event={"ID":"9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4","Type":"ContainerStarted","Data":"016923af35c12ddbbecce4988ec411298099c764fc6f4a360bd73298d7294d83"} Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.616984 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" podUID="9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.617722 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" event={"ID":"f5364dc6-650d-427d-aab6-c50ba3d69b75","Type":"ContainerStarted","Data":"57772daa4e859507d9115e1c3bcceb437d2acc1e0f07263cc9ed26462add5a20"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.618926 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" event={"ID":"84531a1b-f019-449d-8779-05b03bde07cb","Type":"ContainerStarted","Data":"5b9d43f64c8245bc08239624e8a283833c8266236b398e627d7ee1e0ee4ced68"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.620551 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" event={"ID":"f155db6c-255a-4401-884a-b48825bb93c7","Type":"ContainerStarted","Data":"8d0ee455838d5477b13d995aa3d53e7408880582488493068cb5c036dbe84ed2"} Dec 16 15:11:17 
crc kubenswrapper[4728]: I1216 15:11:17.622998 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" event={"ID":"0cc3d254-9633-4e63-91a8-719af70696f6","Type":"ContainerStarted","Data":"aac9a49e3331a464d505d0b507f255a429f0b218f8c3dab48356f327cfedaabc"} Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.623043 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" podUID="f155db6c-255a-4401-884a-b48825bb93c7" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.624661 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" event={"ID":"e501f8ed-3791-4661-8c3e-bfb4eaeeb64d","Type":"ContainerStarted","Data":"2739713d3f0694574879b12d93fadb881096dd973cd1bd9b67053d7f4f7e0053"} Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.626001 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" event={"ID":"8324ae5e-23f8-4267-9822-a4ae37c7cd5a","Type":"ContainerStarted","Data":"28d1e886c0ede6a2c29c9a450566de33ff23e4e38ece4b3cf38edd13c420ee5a"} Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.632120 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" podUID="8324ae5e-23f8-4267-9822-a4ae37c7cd5a" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.634805 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" event={"ID":"fe17017f-5157-4d72-bb40-58a456517c3e","Type":"ContainerStarted","Data":"1a79bb6db30e889bf0c239db919288d479665568344545179bc13052e4831888"} Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.636620 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" podUID="fe17017f-5157-4d72-bb40-58a456517c3e" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.743074 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:17 crc kubenswrapper[4728]: I1216 15:11:17.743236 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod 
\"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.744145 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.744213 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:19.744195749 +0000 UTC m=+860.584374733 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "metrics-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.745556 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:11:17 crc kubenswrapper[4728]: E1216 15:11:17.745591 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:19.745582097 +0000 UTC m=+860.585761071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "webhook-server-cert" not found Dec 16 15:11:18 crc kubenswrapper[4728]: E1216 15:11:18.648546 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" podUID="160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e" Dec 16 15:11:18 crc kubenswrapper[4728]: E1216 15:11:18.648680 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" podUID="8324ae5e-23f8-4267-9822-a4ae37c7cd5a" Dec 16 15:11:18 crc kubenswrapper[4728]: E1216 15:11:18.648818 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" podUID="fe17017f-5157-4d72-bb40-58a456517c3e" Dec 16 15:11:18 crc kubenswrapper[4728]: E1216 15:11:18.649103 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" podUID="f155db6c-255a-4401-884a-b48825bb93c7" Dec 16 15:11:18 crc kubenswrapper[4728]: E1216 15:11:18.649131 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" podUID="6b5beb20-1139-4774-8ea6-b5c951a6cbba" Dec 16 15:11:18 crc kubenswrapper[4728]: E1216 15:11:18.649703 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" podUID="9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4" Dec 16 15:11:19 crc kubenswrapper[4728]: I1216 15:11:19.265393 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.265688 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.266027 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert podName:a8ceccb7-c74c-42c4-a763-d947892f942d nodeName:}" failed. No retries permitted until 2025-12-16 15:11:23.2660113 +0000 UTC m=+864.106190284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert") pod "infra-operator-controller-manager-84b495f78-ljkxp" (UID: "a8ceccb7-c74c-42c4-a763-d947892f942d") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: I1216 15:11:19.368187 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.368323 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.368388 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert podName:a4b04d21-7de1-4565-99e6-fbeb59a0fde6 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:23.368369426 +0000 UTC m=+864.208548410 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" (UID: "a4b04d21-7de1-4565-99e6-fbeb59a0fde6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: I1216 15:11:19.774488 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:19 crc kubenswrapper[4728]: I1216 15:11:19.774726 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.774842 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.774904 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.774917 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:23.774899358 +0000 UTC m=+864.615078342 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "metrics-server-cert" not found Dec 16 15:11:19 crc kubenswrapper[4728]: E1216 15:11:19.774984 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:23.774953569 +0000 UTC m=+864.615132583 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: I1216 15:11:23.334562 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.334745 4728 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.335365 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert podName:a8ceccb7-c74c-42c4-a763-d947892f942d nodeName:}" failed. No retries permitted until 2025-12-16 15:11:31.335342825 +0000 UTC m=+872.175521819 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert") pod "infra-operator-controller-manager-84b495f78-ljkxp" (UID: "a8ceccb7-c74c-42c4-a763-d947892f942d") : secret "infra-operator-webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: I1216 15:11:23.436757 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.436932 4728 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.437009 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert podName:a4b04d21-7de1-4565-99e6-fbeb59a0fde6 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:31.436986621 +0000 UTC m=+872.277165675 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert") pod "openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" (UID: "a4b04d21-7de1-4565-99e6-fbeb59a0fde6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: I1216 15:11:23.869076 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:23 crc kubenswrapper[4728]: I1216 15:11:23.869250 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.869364 4728 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.869423 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:31.869393595 +0000 UTC m=+872.709572579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "webhook-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.869607 4728 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 16 15:11:23 crc kubenswrapper[4728]: E1216 15:11:23.869705 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs podName:0def48bf-646d-4641-93b5-a9e4e058cc67 nodeName:}" failed. No retries permitted until 2025-12-16 15:11:31.869688133 +0000 UTC m=+872.709867117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs") pod "openstack-operator-controller-manager-757cf4457b-v8kt9" (UID: "0def48bf-646d-4641-93b5-a9e4e058cc67") : secret "metrics-server-cert" not found Dec 16 15:11:28 crc kubenswrapper[4728]: E1216 15:11:28.344386 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 16 15:11:28 crc kubenswrapper[4728]: E1216 15:11:28.345204 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8wkv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-95949466-l6vt8_openstack-operators(12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:11:28 crc kubenswrapper[4728]: E1216 15:11:28.346387 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" 
podUID="12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c" Dec 16 15:11:28 crc kubenswrapper[4728]: E1216 15:11:28.706041 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" podUID="12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.074325 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.074524 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xtfnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-767f9d7567-wn6qf_openstack-operators(f5364dc6-650d-427d-aab6-c50ba3d69b75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.075701 4728 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" podUID="f5364dc6-650d-427d-aab6-c50ba3d69b75" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.524038 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.524287 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cgh4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5c7cbf548f-mns5x_openstack-operators(0cc3d254-9633-4e63-91a8-719af70696f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.526445 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" 
podUID="0cc3d254-9633-4e63-91a8-719af70696f6" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.711672 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" podUID="0cc3d254-9633-4e63-91a8-719af70696f6" Dec 16 15:11:29 crc kubenswrapper[4728]: E1216 15:11:29.711804 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027\\\"\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" podUID="f5364dc6-650d-427d-aab6-c50ba3d69b75" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.392324 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.401607 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8ceccb7-c74c-42c4-a763-d947892f942d-cert\") pod \"infra-operator-controller-manager-84b495f78-ljkxp\" (UID: \"a8ceccb7-c74c-42c4-a763-d947892f942d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.494137 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.499895 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4b04d21-7de1-4565-99e6-fbeb59a0fde6-cert\") pod \"openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl\" (UID: \"a4b04d21-7de1-4565-99e6-fbeb59a0fde6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.553464 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.701165 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.900821 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.900963 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.907547 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-webhook-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.908470 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0def48bf-646d-4641-93b5-a9e4e058cc67-metrics-certs\") pod \"openstack-operator-controller-manager-757cf4457b-v8kt9\" (UID: \"0def48bf-646d-4641-93b5-a9e4e058cc67\") " pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:31 crc kubenswrapper[4728]: I1216 15:11:31.950608 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.213846 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cbdqp"] Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.217493 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.232534 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbdqp"] Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.239701 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-catalog-content\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.240549 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwwp\" (UniqueName: \"kubernetes.io/projected/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-kube-api-access-mpwwp\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.240651 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-utilities\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.341980 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwwp\" (UniqueName: \"kubernetes.io/projected/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-kube-api-access-mpwwp\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.342056 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-utilities\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.342115 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-catalog-content\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.342696 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-utilities\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.342806 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-catalog-content\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.369897 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mpwwp\" (UniqueName: \"kubernetes.io/projected/926ced6a-c5ef-4bef-ac8f-4e24b9a3adff-kube-api-access-mpwwp\") pod \"redhat-operators-cbdqp\" (UID: \"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff\") " pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:34 crc kubenswrapper[4728]: I1216 15:11:34.540297 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:11:35 crc kubenswrapper[4728]: E1216 15:11:35.994974 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 16 15:11:36 crc kubenswrapper[4728]: E1216 15:11:35.995703 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjs4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66f8b87655-qtz4v_openstack-operators(84531a1b-f019-449d-8779-05b03bde07cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:11:36 crc kubenswrapper[4728]: E1216 15:11:35.997001 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" podUID="84531a1b-f019-449d-8779-05b03bde07cb" Dec 16 15:11:37 crc kubenswrapper[4728]: E1216 15:11:37.871354 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" podUID="84531a1b-f019-449d-8779-05b03bde07cb" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.401532 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp"] Dec 16 15:11:38 crc kubenswrapper[4728]: W1216 15:11:38.431974 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ceccb7_c74c_42c4_a763_d947892f942d.slice/crio-4a032877f8df38b4f3c446db49d9003f0f2fc2c9b4dbed2311f99291137e8b72 WatchSource:0}: Error finding container 4a032877f8df38b4f3c446db49d9003f0f2fc2c9b4dbed2311f99291137e8b72: Status 404 returned error can't find the container with id 4a032877f8df38b4f3c446db49d9003f0f2fc2c9b4dbed2311f99291137e8b72 Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.500016 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl"] Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.572970 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbdqp"] Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.670607 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9"] Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.795308 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" event={"ID":"dbf95255-3fe3-4421-be60-212514fef21c","Type":"ContainerStarted","Data":"73e3bc064df03c4f254464a5d2b46fb6c67a70a84a87a69ee514c77271d9f604"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.796365 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.819810 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" event={"ID":"90e228b6-e35d-4ee2-992c-364b4abd8436","Type":"ContainerStarted","Data":"333e63b9d6687fb9ee75bfc0987c1aa1686f26e86ac8a878d97c3758b3645240"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.820602 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.828974 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" podStartSLOduration=4.598386661 podStartE2EDuration="23.828960055s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.632740546 +0000 UTC m=+857.472919530" 
lastFinishedPulling="2025-12-16 15:11:35.8633139 +0000 UTC m=+876.703492924" observedRunningTime="2025-12-16 15:11:38.817838459 +0000 UTC m=+879.658017443" watchObservedRunningTime="2025-12-16 15:11:38.828960055 +0000 UTC m=+879.669139039" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.849275 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" event={"ID":"0def48bf-646d-4641-93b5-a9e4e058cc67","Type":"ContainerStarted","Data":"68abe9b317de060140b7a05554322fba4bc37d1c999c002eab3dd61e2dc12282"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.850921 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" event={"ID":"59a84980-fdf4-4ff3-b8c7-464e1423bad3","Type":"ContainerStarted","Data":"06b38ae0992fe056f143c5350badad9d1e37ac47d45f8deee1970ecd176fd3cb"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.852772 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.857780 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" podStartSLOduration=2.527335274 podStartE2EDuration="23.857768238s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.638886575 +0000 UTC m=+857.479065559" lastFinishedPulling="2025-12-16 15:11:37.969319519 +0000 UTC m=+878.809498523" observedRunningTime="2025-12-16 15:11:38.856673118 +0000 UTC m=+879.696852102" watchObservedRunningTime="2025-12-16 15:11:38.857768238 +0000 UTC m=+879.697947222" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.868170 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.868204 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" event={"ID":"75c9a0f4-94bc-4bf5-b164-149256d1a214","Type":"ContainerStarted","Data":"fb6b14fe2aad9765ade7d6b0081b2c28b71fe2cf6fef6cc02a24d59b2c6a424d"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.882017 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbdqp" event={"ID":"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff","Type":"ContainerStarted","Data":"1347bea142dd22b72dc457448189a3a2b45bd720775da8270a2304cbc95d9548"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.906732 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" event={"ID":"a8ceccb7-c74c-42c4-a763-d947892f942d","Type":"ContainerStarted","Data":"4a032877f8df38b4f3c446db49d9003f0f2fc2c9b4dbed2311f99291137e8b72"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.911370 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" podStartSLOduration=2.7870642180000003 podStartE2EDuration="23.911352102s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.831865173 +0000 UTC m=+857.672044157" lastFinishedPulling="2025-12-16 15:11:37.956153057 +0000 UTC m=+878.796332041" 
observedRunningTime="2025-12-16 15:11:38.90293285 +0000 UTC m=+879.743111844" watchObservedRunningTime="2025-12-16 15:11:38.911352102 +0000 UTC m=+879.751531086" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.950303 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" event={"ID":"89d4ec07-baef-4061-b6d8-e50f3ab47bb1","Type":"ContainerStarted","Data":"92bc8e1b0a419f3432a8b435a7a4b1684c0fc509958ac37b56648acc027b50f8"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.951330 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.965084 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" event={"ID":"0252b186-dc46-4cca-ba92-9855cb2aa4ec","Type":"ContainerStarted","Data":"23e0eb7292044eac7309fcf74195fda5e5cddbe35946d8a790f779e1fa45f637"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.966384 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" podStartSLOduration=3.045919698 podStartE2EDuration="23.966367855s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.036677437 +0000 UTC m=+857.876856431" lastFinishedPulling="2025-12-16 15:11:37.957125604 +0000 UTC m=+878.797304588" observedRunningTime="2025-12-16 15:11:38.962678444 +0000 UTC m=+879.802857428" watchObservedRunningTime="2025-12-16 15:11:38.966367855 +0000 UTC m=+879.806546829" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.968045 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.986643 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" event={"ID":"66c25f6d-85c4-4e3e-bf44-93499cc2321c","Type":"ContainerStarted","Data":"71671b3e771fcc21b67b597ccdf4459e8b1f39f13dbf8bcc7f50471404e70fb3"} Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.987633 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:38 crc kubenswrapper[4728]: I1216 15:11:38.994891 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" podStartSLOduration=3.139501613 podStartE2EDuration="23.99487611s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.056452201 +0000 UTC m=+857.896631185" lastFinishedPulling="2025-12-16 15:11:37.911826688 +0000 UTC m=+878.752005682" observedRunningTime="2025-12-16 15:11:38.991935099 +0000 UTC m=+879.832114083" watchObservedRunningTime="2025-12-16 15:11:38.99487611 +0000 UTC m=+879.835055094" Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.003275 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" event={"ID":"660d7a4f-e56a-42c8-8db6-d1f7285d7d04","Type":"ContainerStarted","Data":"46c8eee112fe514d29de96da0a2878b17a2677b2489bf38ddcb0877bc0b47f10"} Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.008087 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" event={"ID":"a4b04d21-7de1-4565-99e6-fbeb59a0fde6","Type":"ContainerStarted","Data":"c8d896ed2bf456355a68c97c7a7299b878f08144772a7dd917598ca70182bd6c"}
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.013869 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" event={"ID":"e501f8ed-3791-4661-8c3e-bfb4eaeeb64d","Type":"ContainerStarted","Data":"0c1cfb223c780ad4a443bf3aeab19fbd32f315c95113d73467c45098e8414a7b"}
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.014684 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc"
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.023245 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" podStartSLOduration=3.321838488 podStartE2EDuration="24.023225719s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.254838378 +0000 UTC m=+858.095017362" lastFinishedPulling="2025-12-16 15:11:37.956225609 +0000 UTC m=+878.796404593" observedRunningTime="2025-12-16 15:11:39.019989501 +0000 UTC m=+879.860168485" watchObservedRunningTime="2025-12-16 15:11:39.023225719 +0000 UTC m=+879.863404703"
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.034574 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" event={"ID":"7a8c4b97-2de8-4235-aa76-c8382c5c5cb1","Type":"ContainerStarted","Data":"df637a6942a82b75930adcef98fbdcfee737298165a94fcda17308ea79dbe7e9"}
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.034859 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq"
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.072961 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" podStartSLOduration=5.007944948 podStartE2EDuration="24.072944066s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.799575766 +0000 UTC m=+857.639754750" lastFinishedPulling="2025-12-16 15:11:35.864574844 +0000 UTC m=+876.704753868" observedRunningTime="2025-12-16 15:11:39.044680929 +0000 UTC m=+879.884859903" watchObservedRunningTime="2025-12-16 15:11:39.072944066 +0000 UTC m=+879.913123050"
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.092110 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" podStartSLOduration=2.930966906 podStartE2EDuration="24.092094293s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.795053781 +0000 UTC m=+857.635232765" lastFinishedPulling="2025-12-16 15:11:37.956181168 +0000 UTC m=+878.796360152" observedRunningTime="2025-12-16 15:11:39.072633368 +0000 UTC m=+879.912812352" watchObservedRunningTime="2025-12-16 15:11:39.092094293 +0000 UTC m=+879.932273277"
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.094943 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" podStartSLOduration=3.17393814 podStartE2EDuration="24.094936611s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.037071998 +0000 UTC m=+857.877250982" lastFinishedPulling="2025-12-16 15:11:37.958070459 +0000 UTC m=+878.798249453" observedRunningTime="2025-12-16 15:11:39.086793617 +0000 UTC m=+879.926972601" watchObservedRunningTime="2025-12-16 15:11:39.094936611 +0000 UTC m=+879.935115595"
Dec 16 15:11:39 crc kubenswrapper[4728]: I1216 15:11:39.111074 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" podStartSLOduration=3.207595746 podStartE2EDuration="24.111053525s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.052697358 +0000 UTC m=+857.892876342" lastFinishedPulling="2025-12-16 15:11:37.956155127 +0000 UTC m=+878.796334121" observedRunningTime="2025-12-16 15:11:39.107011534 +0000 UTC m=+879.947190518" watchObservedRunningTime="2025-12-16 15:11:39.111053525 +0000 UTC m=+879.951232509"
Dec 16 15:11:40 crc kubenswrapper[4728]: I1216 15:11:40.043506 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" event={"ID":"0def48bf-646d-4641-93b5-a9e4e058cc67","Type":"ContainerStarted","Data":"c1b4322ec5510b15f0d14beac3efe9833619f98ed4dd66f761529ff1022bade3"}
Dec 16 15:11:40 crc kubenswrapper[4728]: I1216 15:11:40.043592 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9"
Dec 16 15:11:40 crc kubenswrapper[4728]: I1216 15:11:40.046580 4728 generic.go:334] "Generic (PLEG): container finished" podID="926ced6a-c5ef-4bef-ac8f-4e24b9a3adff" containerID="c787b0ff78bc910b750fddd66cf3d0ed4644aefd37e56ffe5f0f99352276dbd5" exitCode=0
Dec 16 15:11:40 crc kubenswrapper[4728]: I1216 15:11:40.046706 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbdqp" event={"ID":"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff","Type":"ContainerDied","Data":"c787b0ff78bc910b750fddd66cf3d0ed4644aefd37e56ffe5f0f99352276dbd5"}
Dec 16 15:11:40 crc kubenswrapper[4728]: I1216 15:11:40.066927 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" podStartSLOduration=25.066893787 podStartE2EDuration="25.066893787s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:11:40.063831133 +0000 UTC m=+880.904010117" watchObservedRunningTime="2025-12-16 15:11:40.066893787 +0000 UTC m=+880.907072771"
Dec 16 15:11:41 crc kubenswrapper[4728]: I1216 15:11:41.057995 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" event={"ID":"12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c","Type":"ContainerStarted","Data":"e74467457fa937dfecd7802d2dfaa93fef401c60237e990fa778c8599e000d72"}
Dec 16 15:11:41 crc kubenswrapper[4728]: I1216 15:11:41.058688 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8"
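In the openstack-operator-controller-manager record above, firstStartedPulling and lastFinishedPulling are "0001-01-01 00:00:00 +0000 UTC": Go's zero time.Time. The tracker never observed an image pull for this pod (no pull window to subtract), so podStartSLOduration equals podStartE2EDuration at 25.066893787s. A two-line illustration of that sentinel:

package main

import (
	"fmt"
	"time"
)

func main() {
	var firstStartedPulling time.Time                // zero value, never set
	fmt.Println(firstStartedPulling)                 // 0001-01-01 00:00:00 +0000 UTC
	fmt.Println(firstStartedPulling.IsZero())        // true: no pull observed
}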
probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:41 crc kubenswrapper[4728]: I1216 15:11:41.077967 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" podStartSLOduration=2.2185817 podStartE2EDuration="26.077944878s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.298960315 +0000 UTC m=+857.139139299" lastFinishedPulling="2025-12-16 15:11:40.158323493 +0000 UTC m=+880.998502477" observedRunningTime="2025-12-16 15:11:41.074899655 +0000 UTC m=+881.915078649" watchObservedRunningTime="2025-12-16 15:11:41.077944878 +0000 UTC m=+881.918123872" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.523835 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-95949466-l6vt8" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.557907 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-6sdq7" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.724364 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-qpsk9" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.808875 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-hcdxf" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.833155 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-ttkv5" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.907150 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-mfc2h" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.916170 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-t6vdg" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.969559 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-j7jxc" Dec 16 15:11:45 crc kubenswrapper[4728]: I1216 15:11:45.989430 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-68vvq" Dec 16 15:11:46 crc kubenswrapper[4728]: I1216 15:11:46.117622 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-zcz8p" Dec 16 15:11:46 crc kubenswrapper[4728]: I1216 15:11:46.286493 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-dz6x8" Dec 16 15:11:49 crc kubenswrapper[4728]: I1216 15:11:49.755932 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jf5hh"] Dec 16 15:11:49 crc kubenswrapper[4728]: I1216 15:11:49.757860 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:49 crc kubenswrapper[4728]: I1216 15:11:49.764767 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jf5hh"] Dec 16 15:11:49 crc kubenswrapper[4728]: I1216 15:11:49.951497 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-catalog-content\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:49 crc kubenswrapper[4728]: I1216 15:11:49.951560 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-utilities\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:49 crc kubenswrapper[4728]: I1216 15:11:49.951642 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pczg\" (UniqueName: \"kubernetes.io/projected/06a939f4-82ee-43e0-8a85-ad8db9e76b64-kube-api-access-6pczg\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.053567 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-catalog-content\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.053628 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-utilities\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.053705 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pczg\" (UniqueName: \"kubernetes.io/projected/06a939f4-82ee-43e0-8a85-ad8db9e76b64-kube-api-access-6pczg\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.054242 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-catalog-content\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.054482 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-utilities\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.086231 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6pczg\" (UniqueName: \"kubernetes.io/projected/06a939f4-82ee-43e0-8a85-ad8db9e76b64-kube-api-access-6pczg\") pod \"certified-operators-jf5hh\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.376497 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:11:50 crc kubenswrapper[4728]: I1216 15:11:50.507548 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:11:51 crc kubenswrapper[4728]: I1216 15:11:51.185060 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jf5hh"] Dec 16 15:11:51 crc kubenswrapper[4728]: W1216 15:11:51.336261 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-f5ea0e238b2bd0d383a7ec0355050fc04f60372709eec0235e9388b2e4f8a9b1 WatchSource:0}: Error finding container f5ea0e238b2bd0d383a7ec0355050fc04f60372709eec0235e9388b2e4f8a9b1: Status 404 returned error can't find the container with id f5ea0e238b2bd0d383a7ec0355050fc04f60372709eec0235e9388b2e4f8a9b1 Dec 16 15:11:51 crc kubenswrapper[4728]: I1216 15:11:51.955984 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-757cf4457b-v8kt9" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.156478 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" event={"ID":"160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e","Type":"ContainerStarted","Data":"ebb80b9f71757321da309def4ac8edc42672861ffd55d925ae24dc4a54608e58"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.157880 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" event={"ID":"f155db6c-255a-4401-884a-b48825bb93c7","Type":"ContainerStarted","Data":"a2017cae0c71e950d4351c90304feb2ff5c1fbdc2f2598d805c914ae0b636bf9"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.158034 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.159228 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" event={"ID":"fe17017f-5157-4d72-bb40-58a456517c3e","Type":"ContainerStarted","Data":"41d73033a1c7b61c1afb9143e91984dbb61d857d9450bff14ee809efae9322e9"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.159441 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.160992 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbdqp" event={"ID":"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff","Type":"ContainerStarted","Data":"3cfb608a445a8cb870888d3b8b24c09ce3aea99a16f95bfae70ccf8848e07321"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.162027 4728 generic.go:334] "Generic (PLEG): container finished" podID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" 
containerID="752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653" exitCode=0 Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.162053 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerDied","Data":"752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.162076 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerStarted","Data":"f5ea0e238b2bd0d383a7ec0355050fc04f60372709eec0235e9388b2e4f8a9b1"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.164083 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" event={"ID":"a8ceccb7-c74c-42c4-a763-d947892f942d","Type":"ContainerStarted","Data":"575dca02bd12f2153d5d26cbdb9cd32614b17ec673ff42c0453c06d731e2318d"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.164202 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.167723 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" event={"ID":"6b5beb20-1139-4774-8ea6-b5c951a6cbba","Type":"ContainerStarted","Data":"62d34e7026dbaf3080c85ef23c9ebc25644515ea4c212156eea411738af75ee9"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.168190 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.169032 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" event={"ID":"a4b04d21-7de1-4565-99e6-fbeb59a0fde6","Type":"ContainerStarted","Data":"08053baa1c3420924a2bf093c632ec4e38e17da7ff67d894c9818cbda7e197eb"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.169071 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.170510 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" event={"ID":"84531a1b-f019-449d-8779-05b03bde07cb","Type":"ContainerStarted","Data":"5ae47a0c8919e94a2b2cc5a00042b0bb15fc3ab0686129db1d308217a8096fc5"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.170674 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.171989 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" event={"ID":"0cc3d254-9633-4e63-91a8-719af70696f6","Type":"ContainerStarted","Data":"efa34548b2adc3cc617d654fcfb93c404284fcdc79581a816a0a5f011d86a5d4"} Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.172140 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.173846 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz"
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.175146 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" event={"ID":"9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4","Type":"ContainerStarted","Data":"e131e7c0add025e6235a8b188a28dc6d2a61a75db271f4b6238d6520b3c41124"}
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.175342 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw"
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.176652 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" event={"ID":"f5364dc6-650d-427d-aab6-c50ba3d69b75","Type":"ContainerStarted","Data":"da1ff399f3fd5ef00d5cc38328dee4e0d93639e56336a6a206294e4543cc5d18"}
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.176981 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf"
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.229223 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-f9pgp" podStartSLOduration=3.734934462 podStartE2EDuration="37.229203307s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.308160145 +0000 UTC m=+858.148339129" lastFinishedPulling="2025-12-16 15:11:50.80242897 +0000 UTC m=+891.642607974" observedRunningTime="2025-12-16 15:11:52.192758074 +0000 UTC m=+893.032937058" watchObservedRunningTime="2025-12-16 15:11:52.229203307 +0000 UTC m=+893.069382291"
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.259569 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" podStartSLOduration=24.927614407 podStartE2EDuration="37.259550862s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:38.438714351 +0000 UTC m=+879.278893335" lastFinishedPulling="2025-12-16 15:11:50.770650776 +0000 UTC m=+891.610829790" observedRunningTime="2025-12-16 15:11:52.231037027 +0000 UTC m=+893.071216011" watchObservedRunningTime="2025-12-16 15:11:52.259550862 +0000 UTC m=+893.099729846"
Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.293107 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" podStartSLOduration=2.518177772 podStartE2EDuration="37.293086334s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.81175227 +0000 UTC m=+857.651931254" lastFinishedPulling="2025-12-16 15:11:51.586660832 +0000 UTC m=+892.426839816" observedRunningTime="2025-12-16 15:11:52.263717486 +0000 UTC m=+893.103896470" watchObservedRunningTime="2025-12-16 15:11:52.293086334 +0000 UTC m=+893.133265328"
watchObservedRunningTime="2025-12-16 15:11:52.293086334 +0000 UTC m=+893.133265328" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.295152 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" podStartSLOduration=25.155936647 podStartE2EDuration="37.29514321s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:38.556904042 +0000 UTC m=+879.397083026" lastFinishedPulling="2025-12-16 15:11:50.696110605 +0000 UTC m=+891.536289589" observedRunningTime="2025-12-16 15:11:52.291051098 +0000 UTC m=+893.131230082" watchObservedRunningTime="2025-12-16 15:11:52.29514321 +0000 UTC m=+893.135322184" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.381434 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" podStartSLOduration=3.903742306 podStartE2EDuration="37.381391053s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.292384162 +0000 UTC m=+858.132563146" lastFinishedPulling="2025-12-16 15:11:50.770032899 +0000 UTC m=+891.610211893" observedRunningTime="2025-12-16 15:11:52.377917068 +0000 UTC m=+893.218096052" watchObservedRunningTime="2025-12-16 15:11:52.381391053 +0000 UTC m=+893.221570037" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.405706 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" podStartSLOduration=3.989970027 podStartE2EDuration="37.405690022s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.280291828 +0000 UTC m=+858.120470812" lastFinishedPulling="2025-12-16 15:11:50.696011803 +0000 UTC m=+891.536190807" observedRunningTime="2025-12-16 15:11:52.401447714 +0000 UTC m=+893.241626698" watchObservedRunningTime="2025-12-16 15:11:52.405690022 +0000 UTC m=+893.245869006" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.479740 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" podStartSLOduration=3.998978745 podStartE2EDuration="37.479725568s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.217366378 +0000 UTC m=+858.057545362" lastFinishedPulling="2025-12-16 15:11:50.698113041 +0000 UTC m=+891.538292185" observedRunningTime="2025-12-16 15:11:52.476130809 +0000 UTC m=+893.316309793" watchObservedRunningTime="2025-12-16 15:11:52.479725568 +0000 UTC m=+893.319904552" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.481391 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" podStartSLOduration=3.848922838 podStartE2EDuration="37.481382824s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.063429673 +0000 UTC m=+857.903608657" lastFinishedPulling="2025-12-16 15:11:50.695889649 +0000 UTC m=+891.536068643" observedRunningTime="2025-12-16 15:11:52.460653953 +0000 UTC m=+893.300832937" watchObservedRunningTime="2025-12-16 15:11:52.481382824 +0000 UTC m=+893.321561798" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.503339 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" podStartSLOduration=3.871510508 podStartE2EDuration="37.503321587s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.06405751 +0000 UTC m=+857.904236494" lastFinishedPulling="2025-12-16 15:11:50.695868579 +0000 UTC m=+891.536047573" observedRunningTime="2025-12-16 15:11:52.502489264 +0000 UTC m=+893.342668248" watchObservedRunningTime="2025-12-16 15:11:52.503321587 +0000 UTC m=+893.343500561" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.526439 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" podStartSLOduration=3.561632365 podStartE2EDuration="37.526422763s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:16.815089652 +0000 UTC m=+857.655268636" lastFinishedPulling="2025-12-16 15:11:50.77988003 +0000 UTC m=+891.620059034" observedRunningTime="2025-12-16 15:11:52.521197489 +0000 UTC m=+893.361376473" watchObservedRunningTime="2025-12-16 15:11:52.526422763 +0000 UTC m=+893.366601747" Dec 16 15:11:52 crc kubenswrapper[4728]: I1216 15:11:52.547686 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" podStartSLOduration=3.8158845279999998 podStartE2EDuration="37.547670276s" podCreationTimestamp="2025-12-16 15:11:15 +0000 UTC" firstStartedPulling="2025-12-16 15:11:17.049183792 +0000 UTC m=+857.889362776" lastFinishedPulling="2025-12-16 15:11:50.78096953 +0000 UTC m=+891.621148524" observedRunningTime="2025-12-16 15:11:52.54415457 +0000 UTC m=+893.384333564" watchObservedRunningTime="2025-12-16 15:11:52.547670276 +0000 UTC m=+893.387849260" Dec 16 15:11:54 crc kubenswrapper[4728]: I1216 15:11:54.192344 4728 generic.go:334] "Generic (PLEG): container finished" podID="926ced6a-c5ef-4bef-ac8f-4e24b9a3adff" containerID="3cfb608a445a8cb870888d3b8b24c09ce3aea99a16f95bfae70ccf8848e07321" exitCode=0 Dec 16 15:11:54 crc kubenswrapper[4728]: I1216 15:11:54.192438 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbdqp" event={"ID":"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff","Type":"ContainerDied","Data":"3cfb608a445a8cb870888d3b8b24c09ce3aea99a16f95bfae70ccf8848e07321"} Dec 16 15:11:56 crc kubenswrapper[4728]: I1216 15:11:56.174886 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-xvvjw" Dec 16 15:11:56 crc kubenswrapper[4728]: I1216 15:11:56.210169 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-n6x46" Dec 16 15:11:56 crc kubenswrapper[4728]: I1216 15:11:56.243294 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-9p2mz" Dec 16 15:12:00 crc kubenswrapper[4728]: I1216 15:12:00.244032 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerStarted","Data":"8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017"} Dec 16 15:12:01 crc kubenswrapper[4728]: I1216 15:12:01.256785 4728 generic.go:334] "Generic (PLEG): container finished" podID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" 
containerID="8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017" exitCode=0 Dec 16 15:12:01 crc kubenswrapper[4728]: I1216 15:12:01.256953 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerDied","Data":"8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017"} Dec 16 15:12:01 crc kubenswrapper[4728]: I1216 15:12:01.562566 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl" Dec 16 15:12:01 crc kubenswrapper[4728]: I1216 15:12:01.709652 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84b495f78-ljkxp" Dec 16 15:12:02 crc kubenswrapper[4728]: E1216 15:12:02.590592 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:12:05 crc kubenswrapper[4728]: I1216 15:12:05.586713 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-qtz4v" Dec 16 15:12:05 crc kubenswrapper[4728]: I1216 15:12:05.630791 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-wn6qf" Dec 16 15:12:05 crc kubenswrapper[4728]: I1216 15:12:05.894301 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-mns5x" Dec 16 15:12:05 crc kubenswrapper[4728]: I1216 15:12:05.932793 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-xmv9j" Dec 16 15:12:05 crc kubenswrapper[4728]: I1216 15:12:05.951577 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-4jbw4" Dec 16 15:12:12 crc kubenswrapper[4728]: E1216 15:12:12.841830 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.357792 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbdqp" event={"ID":"926ced6a-c5ef-4bef-ac8f-4e24b9a3adff","Type":"ContainerStarted","Data":"345d53575f117fabcc2b30f39e84cf251a48984298a11e9051a958e9e359dcf1"} Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.404496 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cbdqp" podStartSLOduration=6.894138414 podStartE2EDuration="39.404477074s" podCreationTimestamp="2025-12-16 15:11:34 +0000 UTC" firstStartedPulling="2025-12-16 15:11:40.048462721 +0000 UTC m=+880.888641705" lastFinishedPulling="2025-12-16 15:12:12.558801381 +0000 UTC m=+913.398980365" 
observedRunningTime="2025-12-16 15:12:13.401128021 +0000 UTC m=+914.241307095" watchObservedRunningTime="2025-12-16 15:12:13.404477074 +0000 UTC m=+914.244656058" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.614682 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgbc"] Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.619469 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.632459 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgbc"] Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.771141 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-catalog-content\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.771192 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-utilities\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.771233 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6sk\" (UniqueName: \"kubernetes.io/projected/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-kube-api-access-pc6sk\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.872962 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-catalog-content\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.873037 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-utilities\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.873102 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6sk\" (UniqueName: \"kubernetes.io/projected/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-kube-api-access-pc6sk\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.873463 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-catalog-content\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:13 crc kubenswrapper[4728]: 
Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.903144 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6sk\" (UniqueName: \"kubernetes.io/projected/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-kube-api-access-pc6sk\") pod \"redhat-marketplace-hdgbc\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " pod="openshift-marketplace/redhat-marketplace-hdgbc"
Dec 16 15:12:13 crc kubenswrapper[4728]: I1216 15:12:13.969361 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgbc"
Dec 16 15:12:14 crc kubenswrapper[4728]: I1216 15:12:14.375694 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerStarted","Data":"5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2"}
Dec 16 15:12:14 crc kubenswrapper[4728]: I1216 15:12:14.410093 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jf5hh" podStartSLOduration=4.572542458 podStartE2EDuration="25.410070885s" podCreationTimestamp="2025-12-16 15:11:49 +0000 UTC" firstStartedPulling="2025-12-16 15:11:52.163324784 +0000 UTC m=+893.003503768" lastFinishedPulling="2025-12-16 15:12:13.000853191 +0000 UTC m=+913.841032195" observedRunningTime="2025-12-16 15:12:14.402844465 +0000 UTC m=+915.243023459" watchObservedRunningTime="2025-12-16 15:12:14.410070885 +0000 UTC m=+915.250249869"
Dec 16 15:12:14 crc kubenswrapper[4728]: W1216 15:12:14.464767 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03b0574_2d3f_49e9_9eb8_bc52c6d13f92.slice/crio-48bf6c4ed8fbd8f95dd5bd5175a95db3efd8e1b89ac797a62e6a27f01c7c4268 WatchSource:0}: Error finding container 48bf6c4ed8fbd8f95dd5bd5175a95db3efd8e1b89ac797a62e6a27f01c7c4268: Status 404 returned error can't find the container with id 48bf6c4ed8fbd8f95dd5bd5175a95db3efd8e1b89ac797a62e6a27f01c7c4268
Dec 16 15:12:14 crc kubenswrapper[4728]: I1216 15:12:14.466199 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgbc"]
Dec 16 15:12:14 crc kubenswrapper[4728]: I1216 15:12:14.540545 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cbdqp"
Dec 16 15:12:14 crc kubenswrapper[4728]: I1216 15:12:14.540602 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cbdqp"
Dec 16 15:12:15 crc kubenswrapper[4728]: I1216 15:12:15.382958 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerStarted","Data":"6e4dcc79a678f81de7a054b264ab83080b66de384dee9fddb1d64aea61cea70f"}
Dec 16 15:12:15 crc kubenswrapper[4728]: I1216 15:12:15.383222 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerStarted","Data":"48bf6c4ed8fbd8f95dd5bd5175a95db3efd8e1b89ac797a62e6a27f01c7c4268"}
event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerStarted","Data":"48bf6c4ed8fbd8f95dd5bd5175a95db3efd8e1b89ac797a62e6a27f01c7c4268"} Dec 16 15:12:15 crc kubenswrapper[4728]: I1216 15:12:15.597681 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cbdqp" podUID="926ced6a-c5ef-4bef-ac8f-4e24b9a3adff" containerName="registry-server" probeResult="failure" output=< Dec 16 15:12:15 crc kubenswrapper[4728]: timeout: failed to connect service ":50051" within 1s Dec 16 15:12:15 crc kubenswrapper[4728]: > Dec 16 15:12:16 crc kubenswrapper[4728]: I1216 15:12:16.391465 4728 generic.go:334] "Generic (PLEG): container finished" podID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerID="6e4dcc79a678f81de7a054b264ab83080b66de384dee9fddb1d64aea61cea70f" exitCode=0 Dec 16 15:12:16 crc kubenswrapper[4728]: I1216 15:12:16.391517 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerDied","Data":"6e4dcc79a678f81de7a054b264ab83080b66de384dee9fddb1d64aea61cea70f"} Dec 16 15:12:18 crc kubenswrapper[4728]: I1216 15:12:18.405278 4728 generic.go:334] "Generic (PLEG): container finished" podID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerID="d3d99354a7acf56b51b53f5caedcfed42c28cb02f9bb765a9b188a73f155ab88" exitCode=0 Dec 16 15:12:18 crc kubenswrapper[4728]: I1216 15:12:18.405324 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerDied","Data":"d3d99354a7acf56b51b53f5caedcfed42c28cb02f9bb765a9b188a73f155ab88"} Dec 16 15:12:19 crc kubenswrapper[4728]: I1216 15:12:19.421520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerStarted","Data":"6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2"} Dec 16 15:12:19 crc kubenswrapper[4728]: I1216 15:12:19.441314 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hdgbc" podStartSLOduration=3.9707562210000003 podStartE2EDuration="6.441295109s" podCreationTimestamp="2025-12-16 15:12:13 +0000 UTC" firstStartedPulling="2025-12-16 15:12:16.393148752 +0000 UTC m=+917.233327746" lastFinishedPulling="2025-12-16 15:12:18.86368764 +0000 UTC m=+919.703866634" observedRunningTime="2025-12-16 15:12:19.438045359 +0000 UTC m=+920.278224333" watchObservedRunningTime="2025-12-16 15:12:19.441295109 +0000 UTC m=+920.281474093" Dec 16 15:12:20 crc kubenswrapper[4728]: I1216 15:12:20.377313 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:12:20 crc kubenswrapper[4728]: I1216 15:12:20.377649 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:12:20 crc kubenswrapper[4728]: I1216 15:12:20.421988 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:12:20 crc kubenswrapper[4728]: I1216 15:12:20.472335 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:12:20 crc kubenswrapper[4728]: I1216 15:12:20.804110 4728 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-jf5hh"] Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.061456 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-glhkl"] Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.063093 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.065047 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.065344 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wpqph" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.066012 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.066599 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.069543 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-glhkl"] Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.124341 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4qj54"] Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.125433 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.127625 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.164224 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4qj54"] Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.175095 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tp8v\" (UniqueName: \"kubernetes.io/projected/c465bc05-dd37-47cb-9a67-d5889552e647-kube-api-access-4tp8v\") pod \"dnsmasq-dns-675f4bcbfc-glhkl\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.175178 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465bc05-dd37-47cb-9a67-d5889552e647-config\") pod \"dnsmasq-dns-675f4bcbfc-glhkl\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.276275 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.276355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-config\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 
15:12:21.276388 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tp8v\" (UniqueName: \"kubernetes.io/projected/c465bc05-dd37-47cb-9a67-d5889552e647-kube-api-access-4tp8v\") pod \"dnsmasq-dns-675f4bcbfc-glhkl\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.276440 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkpt\" (UniqueName: \"kubernetes.io/projected/33ac9add-559e-4408-86e6-39ecc8732b0c-kube-api-access-9nkpt\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.276541 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465bc05-dd37-47cb-9a67-d5889552e647-config\") pod \"dnsmasq-dns-675f4bcbfc-glhkl\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.277510 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465bc05-dd37-47cb-9a67-d5889552e647-config\") pod \"dnsmasq-dns-675f4bcbfc-glhkl\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.303059 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tp8v\" (UniqueName: \"kubernetes.io/projected/c465bc05-dd37-47cb-9a67-d5889552e647-kube-api-access-4tp8v\") pod \"dnsmasq-dns-675f4bcbfc-glhkl\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.377691 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-config\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.377751 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkpt\" (UniqueName: \"kubernetes.io/projected/33ac9add-559e-4408-86e6-39ecc8732b0c-kube-api-access-9nkpt\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.377796 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.378511 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.378590 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-config\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.387489 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.398342 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkpt\" (UniqueName: \"kubernetes.io/projected/33ac9add-559e-4408-86e6-39ecc8732b0c-kube-api-access-9nkpt\") pod \"dnsmasq-dns-78dd6ddcc-4qj54\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.466867 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.864191 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-glhkl"] Dec 16 15:12:21 crc kubenswrapper[4728]: W1216 15:12:21.868017 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc465bc05_dd37_47cb_9a67_d5889552e647.slice/crio-af591bc06fc2d87860afa62d84d5cb42e6083338e8500954bec9aeaebf988069 WatchSource:0}: Error finding container af591bc06fc2d87860afa62d84d5cb42e6083338e8500954bec9aeaebf988069: Status 404 returned error can't find the container with id af591bc06fc2d87860afa62d84d5cb42e6083338e8500954bec9aeaebf988069 Dec 16 15:12:21 crc kubenswrapper[4728]: I1216 15:12:21.931509 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4qj54"] Dec 16 15:12:21 crc kubenswrapper[4728]: W1216 15:12:21.937989 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ac9add_559e_4408_86e6_39ecc8732b0c.slice/crio-e04b0586cdf43e5393c8aafb6d1ef4edc5474c7ac835645d593af2c7f1ecfa92 WatchSource:0}: Error finding container e04b0586cdf43e5393c8aafb6d1ef4edc5474c7ac835645d593af2c7f1ecfa92: Status 404 returned error can't find the container with id e04b0586cdf43e5393c8aafb6d1ef4edc5474c7ac835645d593af2c7f1ecfa92 Dec 16 15:12:22 crc kubenswrapper[4728]: I1216 15:12:22.448296 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" event={"ID":"c465bc05-dd37-47cb-9a67-d5889552e647","Type":"ContainerStarted","Data":"af591bc06fc2d87860afa62d84d5cb42e6083338e8500954bec9aeaebf988069"} Dec 16 15:12:22 crc kubenswrapper[4728]: I1216 15:12:22.450105 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" event={"ID":"33ac9add-559e-4408-86e6-39ecc8732b0c","Type":"ContainerStarted","Data":"e04b0586cdf43e5393c8aafb6d1ef4edc5474c7ac835645d593af2c7f1ecfa92"} Dec 16 15:12:22 crc kubenswrapper[4728]: I1216 15:12:22.450311 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jf5hh" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="registry-server" containerID="cri-o://5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2" gracePeriod=2 Dec 16 15:12:22 crc kubenswrapper[4728]: I1216 15:12:22.889275 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.003840 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-utilities\") pod \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.003939 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pczg\" (UniqueName: \"kubernetes.io/projected/06a939f4-82ee-43e0-8a85-ad8db9e76b64-kube-api-access-6pczg\") pod \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.004061 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-catalog-content\") pod \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\" (UID: \"06a939f4-82ee-43e0-8a85-ad8db9e76b64\") " Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.004605 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-utilities" (OuterVolumeSpecName: "utilities") pod "06a939f4-82ee-43e0-8a85-ad8db9e76b64" (UID: "06a939f4-82ee-43e0-8a85-ad8db9e76b64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.013312 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a939f4-82ee-43e0-8a85-ad8db9e76b64-kube-api-access-6pczg" (OuterVolumeSpecName: "kube-api-access-6pczg") pod "06a939f4-82ee-43e0-8a85-ad8db9e76b64" (UID: "06a939f4-82ee-43e0-8a85-ad8db9e76b64"). InnerVolumeSpecName "kube-api-access-6pczg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:23 crc kubenswrapper[4728]: E1216 15:12:23.033046 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.058332 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06a939f4-82ee-43e0-8a85-ad8db9e76b64" (UID: "06a939f4-82ee-43e0-8a85-ad8db9e76b64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.105975 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.106004 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pczg\" (UniqueName: \"kubernetes.io/projected/06a939f4-82ee-43e0-8a85-ad8db9e76b64-kube-api-access-6pczg\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.106016 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06a939f4-82ee-43e0-8a85-ad8db9e76b64-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.464806 4728 generic.go:334] "Generic (PLEG): container finished" podID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerID="5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2" exitCode=0 Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.464848 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerDied","Data":"5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2"} Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.464872 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jf5hh" event={"ID":"06a939f4-82ee-43e0-8a85-ad8db9e76b64","Type":"ContainerDied","Data":"f5ea0e238b2bd0d383a7ec0355050fc04f60372709eec0235e9388b2e4f8a9b1"} Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.464875 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jf5hh" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.464887 4728 scope.go:117] "RemoveContainer" containerID="5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.485249 4728 scope.go:117] "RemoveContainer" containerID="8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.504832 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jf5hh"] Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.518094 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jf5hh"] Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.546875 4728 scope.go:117] "RemoveContainer" containerID="752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.568813 4728 scope.go:117] "RemoveContainer" containerID="5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2" Dec 16 15:12:23 crc kubenswrapper[4728]: E1216 15:12:23.569364 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2\": container with ID starting with 5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2 not found: ID does not exist" containerID="5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.569428 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2"} err="failed to get container status \"5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2\": rpc error: code = NotFound desc = could not find container \"5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2\": container with ID starting with 5832aebd9e1a51e6e5ba5aa50413c7ec84b205e64742a2b4027ee751b2ea5db2 not found: ID does not exist" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.569456 4728 scope.go:117] "RemoveContainer" containerID="8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017" Dec 16 15:12:23 crc kubenswrapper[4728]: E1216 15:12:23.570843 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017\": container with ID starting with 8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017 not found: ID does not exist" containerID="8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.570882 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017"} err="failed to get container status \"8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017\": rpc error: code = NotFound desc = could not find container \"8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017\": container with ID starting with 8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017 not found: ID does not exist" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.570921 4728 scope.go:117] "RemoveContainer" 
containerID="752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653" Dec 16 15:12:23 crc kubenswrapper[4728]: E1216 15:12:23.571298 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653\": container with ID starting with 752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653 not found: ID does not exist" containerID="752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.571328 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653"} err="failed to get container status \"752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653\": rpc error: code = NotFound desc = could not find container \"752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653\": container with ID starting with 752a9cb6661c8734813141abd4fb8a38b0e2a0366c1c3583b643fe735decb653 not found: ID does not exist" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.969726 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:23 crc kubenswrapper[4728]: I1216 15:12:23.969791 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.023221 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.183899 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-glhkl"] Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.219089 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-b86k8"] Dec 16 15:12:24 crc kubenswrapper[4728]: E1216 15:12:24.219426 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="registry-server" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.219444 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="registry-server" Dec 16 15:12:24 crc kubenswrapper[4728]: E1216 15:12:24.219457 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="extract-utilities" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.219463 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="extract-utilities" Dec 16 15:12:24 crc kubenswrapper[4728]: E1216 15:12:24.219477 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="extract-content" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.219483 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="extract-content" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.219611 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" containerName="registry-server" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.220276 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.239797 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-b86k8"] Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.321926 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-dns-svc\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.322048 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-config\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.322084 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zptbz\" (UniqueName: \"kubernetes.io/projected/2db861e2-eceb-4128-a546-8c34cc829276-kube-api-access-zptbz\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.423199 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-dns-svc\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.423251 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-config\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.423289 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zptbz\" (UniqueName: \"kubernetes.io/projected/2db861e2-eceb-4128-a546-8c34cc829276-kube-api-access-zptbz\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.424913 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-config\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.425156 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-dns-svc\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.465531 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zptbz\" (UniqueName: 
\"kubernetes.io/projected/2db861e2-eceb-4128-a546-8c34cc829276-kube-api-access-zptbz\") pod \"dnsmasq-dns-666b6646f7-b86k8\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.495447 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4qj54"] Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.510854 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-84twp"] Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.512335 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.523907 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-84twp"] Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.539494 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.563655 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.615838 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.626888 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.627195 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.627327 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2wjl\" (UniqueName: \"kubernetes.io/projected/7136f947-a2e9-45b7-8add-348387eb9645-kube-api-access-n2wjl\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.686152 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cbdqp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.728239 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.728305 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.728327 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2wjl\" (UniqueName: \"kubernetes.io/projected/7136f947-a2e9-45b7-8add-348387eb9645-kube-api-access-n2wjl\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.730008 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.730010 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.773522 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2wjl\" (UniqueName: \"kubernetes.io/projected/7136f947-a2e9-45b7-8add-348387eb9645-kube-api-access-n2wjl\") pod \"dnsmasq-dns-57d769cc4f-84twp\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:24 crc kubenswrapper[4728]: I1216 15:12:24.834232 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.156386 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-b86k8"] Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.356434 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.357779 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.359982 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.360179 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.360453 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z52lk" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.360514 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.360529 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.360556 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.360800 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.392307 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437752 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437796 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437815 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437832 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437855 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzml2\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-kube-api-access-rzml2\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-config-data\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437960 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b12213-b2ec-4fa5-b848-d06fe7855247-pod-info\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.437978 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.438035 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.438060 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b12213-b2ec-4fa5-b848-d06fe7855247-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.438101 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-server-conf\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.521023 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a939f4-82ee-43e0-8a85-ad8db9e76b64" path="/var/lib/kubelet/pods/06a939f4-82ee-43e0-8a85-ad8db9e76b64/volumes" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538540 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b12213-b2ec-4fa5-b848-d06fe7855247-pod-info\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538581 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538608 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538638 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b12213-b2ec-4fa5-b848-d06fe7855247-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538677 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-server-conf\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538721 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538736 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538751 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538767 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538783 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzml2\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-kube-api-access-rzml2\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.538802 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-config-data\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.539632 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-config-data\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.539818 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " 
pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.539848 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.540248 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.540644 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.541142 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-server-conf\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.546164 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.552247 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.555800 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b12213-b2ec-4fa5-b848-d06fe7855247-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.555884 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b12213-b2ec-4fa5-b848-d06fe7855247-pod-info\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.559442 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzml2\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-kube-api-access-rzml2\") pod \"rabbitmq-server-0\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.559566 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" 
(UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.637369 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.639435 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.641240 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.643053 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.643143 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.643064 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.643317 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tl7xr" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.644210 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.644226 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.645294 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.678301 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742347 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742387 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742436 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742455 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31e565e7-a84a-436e-bc5d-dc107a42ef0f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742575 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742648 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742716 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742760 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742818 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742885 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw574\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-kube-api-access-dw574\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.742940 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31e565e7-a84a-436e-bc5d-dc107a42ef0f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.844690 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.844745 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.844789 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.844827 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw574\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-kube-api-access-dw574\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.844855 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31e565e7-a84a-436e-bc5d-dc107a42ef0f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.844913 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.845024 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc 
kubenswrapper[4728]: I1216 15:12:25.845052 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.845074 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.845095 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31e565e7-a84a-436e-bc5d-dc107a42ef0f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.845116 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.845136 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.845311 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.846250 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.846689 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.847424 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.848443 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.848983 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.849713 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31e565e7-a84a-436e-bc5d-dc107a42ef0f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.849948 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.862361 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31e565e7-a84a-436e-bc5d-dc107a42ef0f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.875026 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw574\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-kube-api-access-dw574\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.910706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:25 crc kubenswrapper[4728]: I1216 15:12:25.967275 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:12:26 crc kubenswrapper[4728]: I1216 15:12:26.401088 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgbc"] Dec 16 15:12:26 crc kubenswrapper[4728]: I1216 15:12:26.490778 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hdgbc" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="registry-server" containerID="cri-o://6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2" gracePeriod=2 Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.196192 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.197332 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.199206 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.201259 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.201823 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wlrzx" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.201865 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.212723 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.215519 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272349 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272435 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eb629e93-c552-47c3-8c89-11254ffa834f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272461 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-config-data-default\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272509 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927gl\" (UniqueName: \"kubernetes.io/projected/eb629e93-c552-47c3-8c89-11254ffa834f-kube-api-access-927gl\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272532 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb629e93-c552-47c3-8c89-11254ffa834f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272555 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-kolla-config\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272682 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.272713 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb629e93-c552-47c3-8c89-11254ffa834f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373639 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-927gl\" (UniqueName: \"kubernetes.io/projected/eb629e93-c552-47c3-8c89-11254ffa834f-kube-api-access-927gl\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb629e93-c552-47c3-8c89-11254ffa834f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373708 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-kolla-config\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373767 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373785 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb629e93-c552-47c3-8c89-11254ffa834f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373828 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373860 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eb629e93-c552-47c3-8c89-11254ffa834f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.373875 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-config-data-default\") pod \"openstack-galera-0\" 
(UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.374674 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eb629e93-c552-47c3-8c89-11254ffa834f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.374835 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.375098 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-kolla-config\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.375526 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-config-data-default\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.376017 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb629e93-c552-47c3-8c89-11254ffa834f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.378738 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb629e93-c552-47c3-8c89-11254ffa834f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.384399 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb629e93-c552-47c3-8c89-11254ffa834f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.392485 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-927gl\" (UniqueName: \"kubernetes.io/projected/eb629e93-c552-47c3-8c89-11254ffa834f-kube-api-access-927gl\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.398046 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"eb629e93-c552-47c3-8c89-11254ffa834f\") " pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.511103 4728 generic.go:334] "Generic (PLEG): container finished" podID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" 
containerID="6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2" exitCode=0 Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.517730 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerDied","Data":"6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2"} Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.540267 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 16 15:12:27 crc kubenswrapper[4728]: I1216 15:12:27.830176 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbdqp"] Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.204371 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jh2cv"] Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.205669 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jh2cv" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="registry-server" containerID="cri-o://26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6" gracePeriod=2 Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.498092 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.499201 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.500963 4728 reflector.go:561] object-"openstack"/"galera-openstack-cell1-dockercfg-s5wnf": failed to list *v1.Secret: secrets "galera-openstack-cell1-dockercfg-s5wnf" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.501021 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"galera-openstack-cell1-dockercfg-s5wnf\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"galera-openstack-cell1-dockercfg-s5wnf\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.500982 4728 reflector.go:561] object-"openstack"/"cert-galera-openstack-cell1-svc": failed to list *v1.Secret: secrets "cert-galera-openstack-cell1-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.501063 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-galera-openstack-cell1-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-galera-openstack-cell1-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.501607 4728 reflector.go:561] object-"openstack"/"openstack-cell1-scripts": failed to list *v1.ConfigMap: configmaps 
"openstack-cell1-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.501644 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-cell1-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-cell1-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.502224 4728 reflector.go:561] object-"openstack"/"openstack-cell1-config-data": failed to list *v1.ConfigMap: configmaps "openstack-cell1-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.502258 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-cell1-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-cell1-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.517773 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.542420 4728 generic.go:334] "Generic (PLEG): container finished" podID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerID="26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6" exitCode=0 Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.542447 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerDied","Data":"26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6"} Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692608 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76f2644a-8bb9-4719-83dd-429202a52446-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692719 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692788 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76f2644a-8bb9-4719-83dd-429202a52446-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692823 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692890 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692963 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76f2644a-8bb9-4719-83dd-429202a52446-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.692988 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.693033 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbnl4\" (UniqueName: \"kubernetes.io/projected/76f2644a-8bb9-4719-83dd-429202a52446-kube-api-access-lbnl4\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794615 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794675 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794720 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794743 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76f2644a-8bb9-4719-83dd-429202a52446-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794767 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lbnl4\" (UniqueName: \"kubernetes.io/projected/76f2644a-8bb9-4719-83dd-429202a52446-kube-api-access-lbnl4\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794861 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76f2644a-8bb9-4719-83dd-429202a52446-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794918 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.794996 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76f2644a-8bb9-4719-83dd-429202a52446-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.795078 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.795383 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76f2644a-8bb9-4719-83dd-429202a52446-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.801056 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76f2644a-8bb9-4719-83dd-429202a52446-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.824058 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbnl4\" (UniqueName: \"kubernetes.io/projected/76f2644a-8bb9-4719-83dd-429202a52446-kube-api-access-lbnl4\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.827974 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 15:12:28.945289 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 16 15:12:28 crc kubenswrapper[4728]: I1216 
15:12:28.946948 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.950889 4728 reflector.go:561] object-"openstack"/"memcached-memcached-dockercfg-cdqtv": failed to list *v1.Secret: secrets "memcached-memcached-dockercfg-cdqtv" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.950953 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"memcached-memcached-dockercfg-cdqtv\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"memcached-memcached-dockercfg-cdqtv\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.957118 4728 reflector.go:561] object-"openstack"/"cert-memcached-svc": failed to list *v1.Secret: secrets "cert-memcached-svc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.957213 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cert-memcached-svc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cert-memcached-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:28 crc kubenswrapper[4728]: W1216 15:12:28.957328 4728 reflector.go:561] object-"openstack"/"memcached-config-data": failed to list *v1.ConfigMap: configmaps "memcached-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 16 15:12:28 crc kubenswrapper[4728]: E1216 15:12:28.957345 4728 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"memcached-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"memcached-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.003501 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.098523 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-kolla-config\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.098631 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2b12c-4959-429e-b9db-173f5ddfab90-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: 
I1216 15:12:29.098673 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2b12c-4959-429e-b9db-173f5ddfab90-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.098723 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-config-data\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.098790 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2n6k\" (UniqueName: \"kubernetes.io/projected/8cf2b12c-4959-429e-b9db-173f5ddfab90-kube-api-access-m2n6k\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.200415 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-config-data\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.200513 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2n6k\" (UniqueName: \"kubernetes.io/projected/8cf2b12c-4959-429e-b9db-173f5ddfab90-kube-api-access-m2n6k\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.200550 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-kolla-config\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.200587 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2b12c-4959-429e-b9db-173f5ddfab90-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.200614 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2b12c-4959-429e-b9db-173f5ddfab90-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.204388 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2b12c-4959-429e-b9db-173f5ddfab90-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.224367 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2n6k\" (UniqueName: \"kubernetes.io/projected/8cf2b12c-4959-429e-b9db-173f5ddfab90-kube-api-access-m2n6k\") pod \"memcached-0\" (UID: 
\"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.341545 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.349307 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76f2644a-8bb9-4719-83dd-429202a52446-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.549283 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" event={"ID":"2db861e2-eceb-4128-a546-8c34cc829276","Type":"ContainerStarted","Data":"c480295e6d9b5f6d633deff6946735d65099dcda59c7c38894b7439a87ba2fe7"} Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.673704 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.677001 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.773608 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-s5wnf" Dec 16 15:12:29 crc kubenswrapper[4728]: E1216 15:12:29.795806 4728 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-config-data: failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:29 crc kubenswrapper[4728]: E1216 15:12:29.795838 4728 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-config-data: failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:29 crc kubenswrapper[4728]: E1216 15:12:29.795889 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-config-data-default podName:76f2644a-8bb9-4719-83dd-429202a52446 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:30.295868133 +0000 UTC m=+931.136047117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-config-data-default") pod "openstack-cell1-galera-0" (UID: "76f2644a-8bb9-4719-83dd-429202a52446") : failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:29 crc kubenswrapper[4728]: E1216 15:12:29.795926 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-kolla-config podName:76f2644a-8bb9-4719-83dd-429202a52446 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:30.295902323 +0000 UTC m=+931.136081317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-kolla-config") pod "openstack-cell1-galera-0" (UID: "76f2644a-8bb9-4719-83dd-429202a52446") : failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.811036 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.818235 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2b12c-4959-429e-b9db-173f5ddfab90-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:29 crc kubenswrapper[4728]: I1216 15:12:29.895417 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.041102 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-cdqtv" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.079004 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-84twp"] Dec 16 15:12:30 crc kubenswrapper[4728]: E1216 15:12:30.200957 4728 configmap.go:193] Couldn't get configMap openstack/memcached-config-data: failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:30 crc kubenswrapper[4728]: E1216 15:12:30.201004 4728 configmap.go:193] Couldn't get configMap openstack/memcached-config-data: failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:30 crc kubenswrapper[4728]: E1216 15:12:30.201034 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-kolla-config podName:8cf2b12c-4959-429e-b9db-173f5ddfab90 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:30.701016176 +0000 UTC m=+931.541195160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-kolla-config") pod "memcached-0" (UID: "8cf2b12c-4959-429e-b9db-173f5ddfab90") : failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:30 crc kubenswrapper[4728]: E1216 15:12:30.201089 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-config-data podName:8cf2b12c-4959-429e-b9db-173f5ddfab90 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:30.701063958 +0000 UTC m=+931.541242932 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-config-data") pod "memcached-0" (UID: "8cf2b12c-4959-429e-b9db-173f5ddfab90") : failed to sync configmap cache: timed out waiting for the condition Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.343714 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.343765 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.344515 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.344690 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76f2644a-8bb9-4719-83dd-429202a52446-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"76f2644a-8bb9-4719-83dd-429202a52446\") " pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.351206 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.545192 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.546138 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.548281 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wzsv7" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.563767 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.617747 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.648138 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg6ms\" (UniqueName: \"kubernetes.io/projected/897f23b2-ad11-44ed-b0d2-623529b5e559-kube-api-access-gg6ms\") pod \"kube-state-metrics-0\" (UID: \"897f23b2-ad11-44ed-b0d2-623529b5e559\") " pod="openstack/kube-state-metrics-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.749583 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-kolla-config\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.749695 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg6ms\" (UniqueName: \"kubernetes.io/projected/897f23b2-ad11-44ed-b0d2-623529b5e559-kube-api-access-gg6ms\") pod \"kube-state-metrics-0\" (UID: \"897f23b2-ad11-44ed-b0d2-623529b5e559\") " pod="openstack/kube-state-metrics-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.749730 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-config-data\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.750857 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-config-data\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.751276 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8cf2b12c-4959-429e-b9db-173f5ddfab90-kolla-config\") pod \"memcached-0\" (UID: \"8cf2b12c-4959-429e-b9db-173f5ddfab90\") " pod="openstack/memcached-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.769664 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg6ms\" (UniqueName: \"kubernetes.io/projected/897f23b2-ad11-44ed-b0d2-623529b5e559-kube-api-access-gg6ms\") pod \"kube-state-metrics-0\" (UID: \"897f23b2-ad11-44ed-b0d2-623529b5e559\") " pod="openstack/kube-state-metrics-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.780359 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 16 15:12:30 crc kubenswrapper[4728]: I1216 15:12:30.890285 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.231363 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.475067 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6 is running failed: container process not found" containerID="26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.475628 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6 is running failed: container process not found" containerID="26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.475854 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6 is running failed: container process not found" containerID="26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.475923 4728 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-jh2cv" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="registry-server" Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.971036 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2 is running failed: container process not found" containerID="6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.971437 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2 is running failed: container process not found" containerID="6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.971729 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2 is running failed: container process not found" 
containerID="6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2" cmd=["grpc_health_probe","-addr=:50051"] Dec 16 15:12:33 crc kubenswrapper[4728]: E1216 15:12:33.971763 4728 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hdgbc" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="registry-server" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.367356 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.368479 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.371672 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.371706 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.372295 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xfknp" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.372586 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.372728 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.392490 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.420695 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2809df7-1873-474c-ab44-14b82f630cb0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.420809 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bq29\" (UniqueName: \"kubernetes.io/projected/b2809df7-1873-474c-ab44-14b82f630cb0-kube-api-access-2bq29\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.420889 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2809df7-1873-474c-ab44-14b82f630cb0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.420920 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2809df7-1873-474c-ab44-14b82f630cb0-config\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.420953 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.421051 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.421097 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.421132 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522302 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2809df7-1873-474c-ab44-14b82f630cb0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522338 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2809df7-1873-474c-ab44-14b82f630cb0-config\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522365 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522440 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522468 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522488 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" 
(UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522508 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2809df7-1873-474c-ab44-14b82f630cb0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522542 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bq29\" (UniqueName: \"kubernetes.io/projected/b2809df7-1873-474c-ab44-14b82f630cb0-kube-api-access-2bq29\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.522751 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2809df7-1873-474c-ab44-14b82f630cb0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.523955 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2809df7-1873-474c-ab44-14b82f630cb0-config\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.524632 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2809df7-1873-474c-ab44-14b82f630cb0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.525042 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.528747 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.528748 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.530956 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2809df7-1873-474c-ab44-14b82f630cb0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.541926 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bq29\" (UniqueName: 
\"kubernetes.io/projected/b2809df7-1873-474c-ab44-14b82f630cb0-kube-api-access-2bq29\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.557133 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b2809df7-1873-474c-ab44-14b82f630cb0\") " pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.704435 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:35 crc kubenswrapper[4728]: W1216 15:12:35.786283 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7136f947_a2e9_45b7_8add_348387eb9645.slice/crio-706b1b6a2f7ecd92048e249374b3ecc6a44faf6c3abc51706819b0ae4f849693 WatchSource:0}: Error finding container 706b1b6a2f7ecd92048e249374b3ecc6a44faf6c3abc51706819b0ae4f849693: Status 404 returned error can't find the container with id 706b1b6a2f7ecd92048e249374b3ecc6a44faf6c3abc51706819b0ae4f849693 Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.789875 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hlkkv"] Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.791135 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.793995 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.794086 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.795794 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-z2p4g" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832233 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-scripts\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832304 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-run-ovn\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832354 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l4mv\" (UniqueName: \"kubernetes.io/projected/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-kube-api-access-6l4mv\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832424 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-log-ovn\") pod 
\"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832498 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-run\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832539 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-ovn-controller-tls-certs\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.832580 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-combined-ca-bundle\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.844342 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlkkv"] Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.862029 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.868873 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-j4m68"] Dec 16 15:12:35 crc kubenswrapper[4728]: E1216 15:12:35.869455 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="registry-server" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.869481 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="registry-server" Dec 16 15:12:35 crc kubenswrapper[4728]: E1216 15:12:35.869497 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="extract-content" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.869505 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="extract-content" Dec 16 15:12:35 crc kubenswrapper[4728]: E1216 15:12:35.869519 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="extract-utilities" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.869528 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="extract-utilities" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.869715 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" containerName="registry-server" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.886246 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.886513 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j4m68"] Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.886608 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933171 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-catalog-content\") pod \"91e5f218-48b8-47f0-825c-f9eea263b64c\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933269 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrlm5\" (UniqueName: \"kubernetes.io/projected/91e5f218-48b8-47f0-825c-f9eea263b64c-kube-api-access-xrlm5\") pod \"91e5f218-48b8-47f0-825c-f9eea263b64c\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933313 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-utilities\") pod \"91e5f218-48b8-47f0-825c-f9eea263b64c\" (UID: \"91e5f218-48b8-47f0-825c-f9eea263b64c\") " Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933338 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-catalog-content\") pod \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933427 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc6sk\" (UniqueName: \"kubernetes.io/projected/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-kube-api-access-pc6sk\") pod \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933458 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-utilities\") pod \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\" (UID: \"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92\") " Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933619 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-etc-ovs\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-run\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933683 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-combined-ca-bundle\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933716 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l4mv\" (UniqueName: \"kubernetes.io/projected/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-kube-api-access-6l4mv\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933739 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-log-ovn\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933759 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/865baf70-58f9-4eee-8cf4-d5e96e6d011e-scripts\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933774 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5z2h\" (UniqueName: \"kubernetes.io/projected/865baf70-58f9-4eee-8cf4-d5e96e6d011e-kube-api-access-s5z2h\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933794 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-run\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933810 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-lib\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933833 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-ovn-controller-tls-certs\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933857 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-scripts\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933878 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-run-ovn\") pod \"ovn-controller-hlkkv\" 
(UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.933910 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-log\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.936344 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-log-ovn\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.936871 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-run-ovn\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.937010 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-var-run\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.937222 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-utilities" (OuterVolumeSpecName: "utilities") pod "91e5f218-48b8-47f0-825c-f9eea263b64c" (UID: "91e5f218-48b8-47f0-825c-f9eea263b64c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.938027 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-utilities" (OuterVolumeSpecName: "utilities") pod "a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" (UID: "a03b0574-2d3f-49e9-9eb8-bc52c6d13f92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.959504 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" (UID: "a03b0574-2d3f-49e9-9eb8-bc52c6d13f92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.967812 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-scripts\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.974711 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e5f218-48b8-47f0-825c-f9eea263b64c-kube-api-access-xrlm5" (OuterVolumeSpecName: "kube-api-access-xrlm5") pod "91e5f218-48b8-47f0-825c-f9eea263b64c" (UID: "91e5f218-48b8-47f0-825c-f9eea263b64c"). 
InnerVolumeSpecName "kube-api-access-xrlm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.974811 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-kube-api-access-pc6sk" (OuterVolumeSpecName: "kube-api-access-pc6sk") pod "a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" (UID: "a03b0574-2d3f-49e9-9eb8-bc52c6d13f92"). InnerVolumeSpecName "kube-api-access-pc6sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.975383 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-ovn-controller-tls-certs\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.976333 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-combined-ca-bundle\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:35 crc kubenswrapper[4728]: I1216 15:12:35.979193 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l4mv\" (UniqueName: \"kubernetes.io/projected/37c82b8b-fe2d-4265-80b1-7cdfa00e2be7-kube-api-access-6l4mv\") pod \"ovn-controller-hlkkv\" (UID: \"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7\") " pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035049 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-log\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035094 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-etc-ovs\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035159 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/865baf70-58f9-4eee-8cf4-d5e96e6d011e-scripts\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035175 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5z2h\" (UniqueName: \"kubernetes.io/projected/865baf70-58f9-4eee-8cf4-d5e96e6d011e-kube-api-access-s5z2h\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035196 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-run\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " 
pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035213 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-lib\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035287 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc6sk\" (UniqueName: \"kubernetes.io/projected/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-kube-api-access-pc6sk\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035299 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035308 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035317 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrlm5\" (UniqueName: \"kubernetes.io/projected/91e5f218-48b8-47f0-825c-f9eea263b64c-kube-api-access-xrlm5\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035329 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035291 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-log\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035366 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-run\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035623 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-var-lib\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.035679 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/865baf70-58f9-4eee-8cf4-d5e96e6d011e-etc-ovs\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.037494 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/865baf70-58f9-4eee-8cf4-d5e96e6d011e-scripts\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" 
Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.052999 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5z2h\" (UniqueName: \"kubernetes.io/projected/865baf70-58f9-4eee-8cf4-d5e96e6d011e-kube-api-access-s5z2h\") pod \"ovn-controller-ovs-j4m68\" (UID: \"865baf70-58f9-4eee-8cf4-d5e96e6d011e\") " pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.068154 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91e5f218-48b8-47f0-825c-f9eea263b64c" (UID: "91e5f218-48b8-47f0-825c-f9eea263b64c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.136431 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e5f218-48b8-47f0-825c-f9eea263b64c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.185725 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.203175 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.610610 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jh2cv" event={"ID":"91e5f218-48b8-47f0-825c-f9eea263b64c","Type":"ContainerDied","Data":"76b3d3c71bf47175c17d5aa259ba2e89d946431ffa72506ee8e91155eab412ce"} Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.610656 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jh2cv" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.610666 4728 scope.go:117] "RemoveContainer" containerID="26f8f124a84c5ed9d92c5fdaccba3f2a6f78fc8679613a7460e0261ec7b4dfc6" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.613699 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hdgbc" event={"ID":"a03b0574-2d3f-49e9-9eb8-bc52c6d13f92","Type":"ContainerDied","Data":"48bf6c4ed8fbd8f95dd5bd5175a95db3efd8e1b89ac797a62e6a27f01c7c4268"} Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.613727 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hdgbc" Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.614925 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" event={"ID":"7136f947-a2e9-45b7-8add-348387eb9645","Type":"ContainerStarted","Data":"706b1b6a2f7ecd92048e249374b3ecc6a44faf6c3abc51706819b0ae4f849693"} Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.647799 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jh2cv"] Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.665455 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jh2cv"] Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.672795 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgbc"] Dec 16 15:12:36 crc kubenswrapper[4728]: I1216 15:12:36.678345 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hdgbc"] Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.519101 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" path="/var/lib/kubelet/pods/91e5f218-48b8-47f0-825c-f9eea263b64c/volumes" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.519932 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a03b0574-2d3f-49e9-9eb8-bc52c6d13f92" path="/var/lib/kubelet/pods/a03b0574-2d3f-49e9-9eb8-bc52c6d13f92/volumes" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.536740 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 15:12:37 crc kubenswrapper[4728]: E1216 15:12:37.537089 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="extract-utilities" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.537107 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="extract-utilities" Dec 16 15:12:37 crc kubenswrapper[4728]: E1216 15:12:37.537156 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="extract-content" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.537163 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="extract-content" Dec 16 15:12:37 crc kubenswrapper[4728]: E1216 15:12:37.537173 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="registry-server" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.537179 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="registry-server" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.537353 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e5f218-48b8-47f0-825c-f9eea263b64c" containerName="registry-server" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.538235 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.540455 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.540765 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.540950 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-v6t5f" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.542734 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.553533 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.568876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.568944 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.568967 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.568986 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.569047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.569082 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.569111 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " 
pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.569139 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65sz6\" (UniqueName: \"kubernetes.io/projected/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-kube-api-access-65sz6\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671084 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671168 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671221 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671265 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65sz6\" (UniqueName: \"kubernetes.io/projected/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-kube-api-access-65sz6\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671316 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671362 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671393 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671502 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.671770 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.672167 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-config\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.672213 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.675831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.676106 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.685119 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.689645 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.692570 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65sz6\" (UniqueName: \"kubernetes.io/projected/d587bd5e-c0c9-48f1-a2b6-616e904ceed3-kube-api-access-65sz6\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.692830 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d587bd5e-c0c9-48f1-a2b6-616e904ceed3\") " pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:37 crc kubenswrapper[4728]: I1216 15:12:37.906014 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.425571 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4kc6d"] Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.432979 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.451267 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4kc6d"] Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.540539 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-utilities\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.540595 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-catalog-content\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.540630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr2gk\" (UniqueName: \"kubernetes.io/projected/3d759943-69c0-4ea1-b8cf-93060971988c-kube-api-access-fr2gk\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.641880 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-utilities\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.641926 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-catalog-content\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.641974 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr2gk\" (UniqueName: \"kubernetes.io/projected/3d759943-69c0-4ea1-b8cf-93060971988c-kube-api-access-fr2gk\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.642544 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-utilities\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.642587 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-catalog-content\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.661182 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fr2gk\" (UniqueName: \"kubernetes.io/projected/3d759943-69c0-4ea1-b8cf-93060971988c-kube-api-access-fr2gk\") pod \"community-operators-4kc6d\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:41 crc kubenswrapper[4728]: I1216 15:12:41.863653 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:12:42 crc kubenswrapper[4728]: I1216 15:12:42.569587 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.103100 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.103298 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nkpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4qj54_openstack(33ac9add-559e-4408-86e6-39ecc8732b0c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.104587 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" podUID="33ac9add-559e-4408-86e6-39ecc8732b0c" Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.123975 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.124205 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tp8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-glhkl_openstack(c465bc05-dd37-47cb-9a67-d5889552e647): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.125609 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" podUID="c465bc05-dd37-47cb-9a67-d5889552e647" Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.126353 4728 scope.go:117] "RemoveContainer" containerID="7d09660bad10ffad1a6739e5a428f0ca506033f5eb5ee6c8b81678bc28fa2f36" Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.278236 4728 scope.go:117] "RemoveContainer" containerID="82d0513e5b16abca1fed5d6bf9b535a054b809c4ed37bdc8952518b055efe58d" Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.323964 4728 scope.go:117] "RemoveContainer" 
containerID="6ba6726f9b2de7a2a5f381b7e7c29ed0f456f03219e5846cb0dd12f559648ae2" Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.370135 4728 scope.go:117] "RemoveContainer" containerID="d3d99354a7acf56b51b53f5caedcfed42c28cb02f9bb765a9b188a73f155ab88" Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.425997 4728 scope.go:117] "RemoveContainer" containerID="6e4dcc79a678f81de7a054b264ab83080b66de384dee9fddb1d64aea61cea70f" Dec 16 15:12:43 crc kubenswrapper[4728]: E1216 15:12:43.582529 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.665678 4728 generic.go:334] "Generic (PLEG): container finished" podID="7136f947-a2e9-45b7-8add-348387eb9645" containerID="2b319ed4a1932c7dcf5d82f4deac4df826da345b25a46f529b3f9d97664a31cd" exitCode=0 Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.665763 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" event={"ID":"7136f947-a2e9-45b7-8add-348387eb9645","Type":"ContainerDied","Data":"2b319ed4a1932c7dcf5d82f4deac4df826da345b25a46f529b3f9d97664a31cd"} Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.675039 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b12213-b2ec-4fa5-b848-d06fe7855247","Type":"ContainerStarted","Data":"ea98c649e14fe396963ed064767296630c30364215dc40884e12396c98ee0c43"} Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.678941 4728 generic.go:334] "Generic (PLEG): container finished" podID="2db861e2-eceb-4128-a546-8c34cc829276" containerID="6fd4884b3c6e09769ee17288ab4c2b53688a0ef4e041efdfc25b2deb1b71ff8b" exitCode=0 Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.679041 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" event={"ID":"2db861e2-eceb-4128-a546-8c34cc829276","Type":"ContainerDied","Data":"6fd4884b3c6e09769ee17288ab4c2b53688a0ef4e041efdfc25b2deb1b71ff8b"} Dec 16 15:12:43 crc kubenswrapper[4728]: I1216 15:12:43.758799 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.014165 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.034853 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.057488 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.071733 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlkkv"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.090841 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: W1216 15:12:44.113720 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb629e93_c552_47c3_8c89_11254ffa834f.slice/crio-52c76fe3b8ff7db2754b85bcc25d8e4edecdd2666515955233004b7b26e120b6 WatchSource:0}: Error finding container 52c76fe3b8ff7db2754b85bcc25d8e4edecdd2666515955233004b7b26e120b6: Status 404 returned error can't find the container with id 52c76fe3b8ff7db2754b85bcc25d8e4edecdd2666515955233004b7b26e120b6 Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.191176 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tp8v\" (UniqueName: \"kubernetes.io/projected/c465bc05-dd37-47cb-9a67-d5889552e647-kube-api-access-4tp8v\") pod \"c465bc05-dd37-47cb-9a67-d5889552e647\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.191375 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkpt\" (UniqueName: \"kubernetes.io/projected/33ac9add-559e-4408-86e6-39ecc8732b0c-kube-api-access-9nkpt\") pod \"33ac9add-559e-4408-86e6-39ecc8732b0c\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.191418 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-dns-svc\") pod \"33ac9add-559e-4408-86e6-39ecc8732b0c\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.191468 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465bc05-dd37-47cb-9a67-d5889552e647-config\") pod \"c465bc05-dd37-47cb-9a67-d5889552e647\" (UID: \"c465bc05-dd37-47cb-9a67-d5889552e647\") " Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.191524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-config\") pod \"33ac9add-559e-4408-86e6-39ecc8732b0c\" (UID: \"33ac9add-559e-4408-86e6-39ecc8732b0c\") " Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.192223 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33ac9add-559e-4408-86e6-39ecc8732b0c" (UID: "33ac9add-559e-4408-86e6-39ecc8732b0c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.192243 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-config" (OuterVolumeSpecName: "config") pod "33ac9add-559e-4408-86e6-39ecc8732b0c" (UID: "33ac9add-559e-4408-86e6-39ecc8732b0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.192255 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c465bc05-dd37-47cb-9a67-d5889552e647-config" (OuterVolumeSpecName: "config") pod "c465bc05-dd37-47cb-9a67-d5889552e647" (UID: "c465bc05-dd37-47cb-9a67-d5889552e647"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.197055 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ac9add-559e-4408-86e6-39ecc8732b0c-kube-api-access-9nkpt" (OuterVolumeSpecName: "kube-api-access-9nkpt") pod "33ac9add-559e-4408-86e6-39ecc8732b0c" (UID: "33ac9add-559e-4408-86e6-39ecc8732b0c"). InnerVolumeSpecName "kube-api-access-9nkpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.197173 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c465bc05-dd37-47cb-9a67-d5889552e647-kube-api-access-4tp8v" (OuterVolumeSpecName: "kube-api-access-4tp8v") pod "c465bc05-dd37-47cb-9a67-d5889552e647" (UID: "c465bc05-dd37-47cb-9a67-d5889552e647"). InnerVolumeSpecName "kube-api-access-4tp8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.202176 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: W1216 15:12:44.204650 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2809df7_1873_474c_ab44_14b82f630cb0.slice/crio-03239e93a60b8f2cda089fb7802d90e0db58da37113baee0532343544f489360 WatchSource:0}: Error finding container 03239e93a60b8f2cda089fb7802d90e0db58da37113baee0532343544f489360: Status 404 returned error can't find the container with id 03239e93a60b8f2cda089fb7802d90e0db58da37113baee0532343544f489360 Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.247526 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: W1216 15:12:44.253748 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod897f23b2_ad11_44ed_b0d2_623529b5e559.slice/crio-6e898a7cef57539d74dedeffe1ddec0fc9548245e611bb4f2dead9058509a8f3 WatchSource:0}: Error finding container 6e898a7cef57539d74dedeffe1ddec0fc9548245e611bb4f2dead9058509a8f3: Status 404 returned error can't find the container with id 6e898a7cef57539d74dedeffe1ddec0fc9548245e611bb4f2dead9058509a8f3 Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.257659 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.292055 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j4m68"] Dec 16 15:12:44 crc 
kubenswrapper[4728]: I1216 15:12:44.292923 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkpt\" (UniqueName: \"kubernetes.io/projected/33ac9add-559e-4408-86e6-39ecc8732b0c-kube-api-access-9nkpt\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.292944 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.292953 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c465bc05-dd37-47cb-9a67-d5889552e647-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.292961 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ac9add-559e-4408-86e6-39ecc8732b0c-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.292968 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tp8v\" (UniqueName: \"kubernetes.io/projected/c465bc05-dd37-47cb-9a67-d5889552e647-kube-api-access-4tp8v\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:44 crc kubenswrapper[4728]: W1216 15:12:44.312068 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod865baf70_58f9_4eee_8cf4_d5e96e6d011e.slice/crio-3b27fc71da4d9aa32ab88c2cf671c09d6c42ccc4af03117a98e12fb783992b87 WatchSource:0}: Error finding container 3b27fc71da4d9aa32ab88c2cf671c09d6c42ccc4af03117a98e12fb783992b87: Status 404 returned error can't find the container with id 3b27fc71da4d9aa32ab88c2cf671c09d6c42ccc4af03117a98e12fb783992b87 Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.337910 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4kc6d"] Dec 16 15:12:44 crc kubenswrapper[4728]: W1216 15:12:44.342687 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d759943_69c0_4ea1_b8cf_93060971988c.slice/crio-fdecd74d1f9f5a52f25aa30f402179f6191bdc610876a45f9fe2442229346741 WatchSource:0}: Error finding container fdecd74d1f9f5a52f25aa30f402179f6191bdc610876a45f9fe2442229346741: Status 404 returned error can't find the container with id fdecd74d1f9f5a52f25aa30f402179f6191bdc610876a45f9fe2442229346741 Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.405053 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.689246 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d587bd5e-c0c9-48f1-a2b6-616e904ceed3","Type":"ContainerStarted","Data":"d053c97851532034586c00d83bfa0aeaed797ec4facb7318870b904fdac895f5"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.691320 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76f2644a-8bb9-4719-83dd-429202a52446","Type":"ContainerStarted","Data":"f0639e6f002ba53d8de1fca51a9ff44414c9a41a2ac5d2261c651922eed15195"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.693338 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"897f23b2-ad11-44ed-b0d2-623529b5e559","Type":"ContainerStarted","Data":"6e898a7cef57539d74dedeffe1ddec0fc9548245e611bb4f2dead9058509a8f3"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.694951 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4m68" event={"ID":"865baf70-58f9-4eee-8cf4-d5e96e6d011e","Type":"ContainerStarted","Data":"3b27fc71da4d9aa32ab88c2cf671c09d6c42ccc4af03117a98e12fb783992b87"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.696870 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eb629e93-c552-47c3-8c89-11254ffa834f","Type":"ContainerStarted","Data":"52c76fe3b8ff7db2754b85bcc25d8e4edecdd2666515955233004b7b26e120b6"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.698135 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" event={"ID":"33ac9add-559e-4408-86e6-39ecc8732b0c","Type":"ContainerDied","Data":"e04b0586cdf43e5393c8aafb6d1ef4edc5474c7ac835645d593af2c7f1ecfa92"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.698160 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4qj54" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.699484 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b2809df7-1873-474c-ab44-14b82f630cb0","Type":"ContainerStarted","Data":"03239e93a60b8f2cda089fb7802d90e0db58da37113baee0532343544f489360"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.701727 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" event={"ID":"c465bc05-dd37-47cb-9a67-d5889552e647","Type":"ContainerDied","Data":"af591bc06fc2d87860afa62d84d5cb42e6083338e8500954bec9aeaebf988069"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.701761 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-glhkl" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.703250 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlkkv" event={"ID":"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7","Type":"ContainerStarted","Data":"b652b95680aef8db5b8f1c52c6262bfd8b8ac5999b370fddd846142470e98788"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.706945 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" event={"ID":"2db861e2-eceb-4128-a546-8c34cc829276","Type":"ContainerStarted","Data":"7a381d766b0fc86fd27fb3b714fb2ba87e1dcbe75bde3162cacdb6ecef72e8c6"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.707070 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.713142 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" event={"ID":"7136f947-a2e9-45b7-8add-348387eb9645","Type":"ContainerStarted","Data":"a1c8a3d794837fc0893d24596026b22bff65299ec3e12fcfa6c0a9feb8ed3a1b"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.713271 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.714155 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31e565e7-a84a-436e-bc5d-dc107a42ef0f","Type":"ContainerStarted","Data":"eb936f4744cb9c19394b2cb4f41d59227a74fd9327a19cdbb5053c25cd272ee5"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.715799 4728 generic.go:334] "Generic (PLEG): container finished" podID="3d759943-69c0-4ea1-b8cf-93060971988c" containerID="5d54a14280ffcd49a5c68a799bd99a4e93ad5c301edce2bbbf8243f820cf573f" exitCode=0 Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.716393 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kc6d" event={"ID":"3d759943-69c0-4ea1-b8cf-93060971988c","Type":"ContainerDied","Data":"5d54a14280ffcd49a5c68a799bd99a4e93ad5c301edce2bbbf8243f820cf573f"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.716434 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kc6d" event={"ID":"3d759943-69c0-4ea1-b8cf-93060971988c","Type":"ContainerStarted","Data":"fdecd74d1f9f5a52f25aa30f402179f6191bdc610876a45f9fe2442229346741"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.722833 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8cf2b12c-4959-429e-b9db-173f5ddfab90","Type":"ContainerStarted","Data":"358198128ed1f0fc1a73011ebd71e960fc1510bd9fec2f0540a9458556445ff7"} Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.748464 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" podStartSLOduration=6.799784066 podStartE2EDuration="20.748446103s" podCreationTimestamp="2025-12-16 15:12:24 +0000 UTC" firstStartedPulling="2025-12-16 15:12:29.382382038 +0000 UTC m=+930.222561022" lastFinishedPulling="2025-12-16 15:12:43.331044075 +0000 UTC m=+944.171223059" observedRunningTime="2025-12-16 15:12:44.740654729 +0000 UTC m=+945.580833713" watchObservedRunningTime="2025-12-16 15:12:44.748446103 +0000 UTC m=+945.588625087" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 
15:12:44.841763 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4qj54"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.852668 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4qj54"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.859783 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" podStartSLOduration=13.34733117 podStartE2EDuration="20.859747025s" podCreationTimestamp="2025-12-16 15:12:24 +0000 UTC" firstStartedPulling="2025-12-16 15:12:35.788465101 +0000 UTC m=+936.628644085" lastFinishedPulling="2025-12-16 15:12:43.300880956 +0000 UTC m=+944.141059940" observedRunningTime="2025-12-16 15:12:44.835063245 +0000 UTC m=+945.675242249" watchObservedRunningTime="2025-12-16 15:12:44.859747025 +0000 UTC m=+945.699925999" Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.894475 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-glhkl"] Dec 16 15:12:44 crc kubenswrapper[4728]: I1216 15:12:44.899362 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-glhkl"] Dec 16 15:12:45 crc kubenswrapper[4728]: I1216 15:12:45.533177 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ac9add-559e-4408-86e6-39ecc8732b0c" path="/var/lib/kubelet/pods/33ac9add-559e-4408-86e6-39ecc8732b0c/volumes" Dec 16 15:12:45 crc kubenswrapper[4728]: I1216 15:12:45.533534 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c465bc05-dd37-47cb-9a67-d5889552e647" path="/var/lib/kubelet/pods/c465bc05-dd37-47cb-9a67-d5889552e647/volumes" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.527800 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ccc6t"] Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.529731 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.533621 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.534009 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ccc6t"] Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.550000 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.639530 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-84twp"] Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.640352 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="dnsmasq-dns" containerID="cri-o://a1c8a3d794837fc0893d24596026b22bff65299ec3e12fcfa6c0a9feb8ed3a1b" gracePeriod=10 Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.641559 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.679836 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-2x88b"] Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.682208 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.688764 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698249 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-2x88b"] Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698515 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/effa7d99-cccc-431b-91b6-d4302f7dce22-ovn-rundir\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698552 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effa7d99-cccc-431b-91b6-d4302f7dce22-combined-ca-bundle\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698575 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/effa7d99-cccc-431b-91b6-d4302f7dce22-ovs-rundir\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698604 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa7d99-cccc-431b-91b6-d4302f7dce22-config\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " 
pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698659 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlbc7\" (UniqueName: \"kubernetes.io/projected/effa7d99-cccc-431b-91b6-d4302f7dce22-kube-api-access-vlbc7\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.698684 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/effa7d99-cccc-431b-91b6-d4302f7dce22-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.800867 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.800962 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/effa7d99-cccc-431b-91b6-d4302f7dce22-ovn-rundir\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effa7d99-cccc-431b-91b6-d4302f7dce22-combined-ca-bundle\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801104 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/effa7d99-cccc-431b-91b6-d4302f7dce22-ovs-rundir\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801134 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpxqt\" (UniqueName: \"kubernetes.io/projected/d63a76e8-4889-4a91-b18a-78229e2818a1-kube-api-access-gpxqt\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801206 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa7d99-cccc-431b-91b6-d4302f7dce22-config\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801296 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlbc7\" (UniqueName: \"kubernetes.io/projected/effa7d99-cccc-431b-91b6-d4302f7dce22-kube-api-access-vlbc7\") pod 
\"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801332 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/effa7d99-cccc-431b-91b6-d4302f7dce22-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801478 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-config\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801765 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/effa7d99-cccc-431b-91b6-d4302f7dce22-ovs-rundir\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.801849 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/effa7d99-cccc-431b-91b6-d4302f7dce22-ovn-rundir\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.802556 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa7d99-cccc-431b-91b6-d4302f7dce22-config\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.837471 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effa7d99-cccc-431b-91b6-d4302f7dce22-combined-ca-bundle\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.860521 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.96:5353: connect: connection refused" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.860773 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/effa7d99-cccc-431b-91b6-d4302f7dce22-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 
crc kubenswrapper[4728]: I1216 15:12:49.867963 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlbc7\" (UniqueName: \"kubernetes.io/projected/effa7d99-cccc-431b-91b6-d4302f7dce22-kube-api-access-vlbc7\") pod \"ovn-controller-metrics-ccc6t\" (UID: \"effa7d99-cccc-431b-91b6-d4302f7dce22\") " pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.902641 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.902719 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-config\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.902762 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.902800 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpxqt\" (UniqueName: \"kubernetes.io/projected/d63a76e8-4889-4a91-b18a-78229e2818a1-kube-api-access-gpxqt\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.903565 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.903716 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-config\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.904058 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.945161 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpxqt\" (UniqueName: \"kubernetes.io/projected/d63a76e8-4889-4a91-b18a-78229e2818a1-kube-api-access-gpxqt\") pod \"dnsmasq-dns-7fd796d7df-2x88b\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.974459 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7fd796d7df-2x88b"] Dec 16 15:12:49 crc kubenswrapper[4728]: I1216 15:12:49.974964 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.026240 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w49k5"] Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.047985 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.060971 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w49k5"] Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.067656 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.152736 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ccc6t" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.211293 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.211352 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7w2\" (UniqueName: \"kubernetes.io/projected/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-kube-api-access-2w7w2\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.211397 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-config\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.211500 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.211642 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.313059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-config\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.313177 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.313199 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.313281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.313309 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7w2\" (UniqueName: \"kubernetes.io/projected/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-kube-api-access-2w7w2\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.314044 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-config\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.314316 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.314814 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.315103 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.349306 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7w2\" (UniqueName: \"kubernetes.io/projected/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-kube-api-access-2w7w2\") pod \"dnsmasq-dns-86db49b7ff-w49k5\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.369880 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.772350 4728 generic.go:334] "Generic (PLEG): container finished" podID="7136f947-a2e9-45b7-8add-348387eb9645" containerID="a1c8a3d794837fc0893d24596026b22bff65299ec3e12fcfa6c0a9feb8ed3a1b" exitCode=0 Dec 16 15:12:50 crc kubenswrapper[4728]: I1216 15:12:50.772462 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" event={"ID":"7136f947-a2e9-45b7-8add-348387eb9645","Type":"ContainerDied","Data":"a1c8a3d794837fc0893d24596026b22bff65299ec3e12fcfa6c0a9feb8ed3a1b"} Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.699662 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.794972 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-dns-svc\") pod \"7136f947-a2e9-45b7-8add-348387eb9645\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.795142 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config\") pod \"7136f947-a2e9-45b7-8add-348387eb9645\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.795174 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2wjl\" (UniqueName: \"kubernetes.io/projected/7136f947-a2e9-45b7-8add-348387eb9645-kube-api-access-n2wjl\") pod \"7136f947-a2e9-45b7-8add-348387eb9645\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.797389 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" event={"ID":"7136f947-a2e9-45b7-8add-348387eb9645","Type":"ContainerDied","Data":"706b1b6a2f7ecd92048e249374b3ecc6a44faf6c3abc51706819b0ae4f849693"} Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.797446 4728 scope.go:117] "RemoveContainer" containerID="a1c8a3d794837fc0893d24596026b22bff65299ec3e12fcfa6c0a9feb8ed3a1b" Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.797460 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-84twp" Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.798557 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7136f947-a2e9-45b7-8add-348387eb9645-kube-api-access-n2wjl" (OuterVolumeSpecName: "kube-api-access-n2wjl") pod "7136f947-a2e9-45b7-8add-348387eb9645" (UID: "7136f947-a2e9-45b7-8add-348387eb9645"). InnerVolumeSpecName "kube-api-access-n2wjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:52 crc kubenswrapper[4728]: E1216 15:12:52.869428 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config podName:7136f947-a2e9-45b7-8add-348387eb9645 nodeName:}" failed. No retries permitted until 2025-12-16 15:12:53.369384766 +0000 UTC m=+954.209563750 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config") pod "7136f947-a2e9-45b7-8add-348387eb9645" (UID: "7136f947-a2e9-45b7-8add-348387eb9645") : error deleting /var/lib/kubelet/pods/7136f947-a2e9-45b7-8add-348387eb9645/volume-subpaths: remove /var/lib/kubelet/pods/7136f947-a2e9-45b7-8add-348387eb9645/volume-subpaths: no such file or directory Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.869775 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7136f947-a2e9-45b7-8add-348387eb9645" (UID: "7136f947-a2e9-45b7-8add-348387eb9645"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.896712 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2wjl\" (UniqueName: \"kubernetes.io/projected/7136f947-a2e9-45b7-8add-348387eb9645-kube-api-access-n2wjl\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.896945 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:52 crc kubenswrapper[4728]: I1216 15:12:52.983487 4728 scope.go:117] "RemoveContainer" containerID="2b319ed4a1932c7dcf5d82f4deac4df826da345b25a46f529b3f9d97664a31cd" Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.214444 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ccc6t"] Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.357885 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-2x88b"] Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.367421 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w49k5"] Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.406366 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config\") pod \"7136f947-a2e9-45b7-8add-348387eb9645\" (UID: \"7136f947-a2e9-45b7-8add-348387eb9645\") " Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.407662 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config" (OuterVolumeSpecName: "config") pod "7136f947-a2e9-45b7-8add-348387eb9645" (UID: "7136f947-a2e9-45b7-8add-348387eb9645"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.507432 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7136f947-a2e9-45b7-8add-348387eb9645-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.740366 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-84twp"] Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.760946 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-84twp"] Dec 16 15:12:53 crc kubenswrapper[4728]: E1216 15:12:53.789040 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a939f4_82ee_43e0_8a85_ad8db9e76b64.slice/crio-8e99ea0713bd71fd7a0e2fb45766109f00e51c322dc9e9fd3e50bb3b62d43017.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.805613 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ccc6t" event={"ID":"effa7d99-cccc-431b-91b6-d4302f7dce22","Type":"ContainerStarted","Data":"d1c3affab175e54737eebf9a27ac9e79315be975f2f1df980d7a726606fb1539"} Dec 16 15:12:53 crc kubenswrapper[4728]: I1216 15:12:53.806729 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" event={"ID":"d63a76e8-4889-4a91-b18a-78229e2818a1","Type":"ContainerStarted","Data":"d37a458c6be207117db2a3be28490466ea704a7eca8f35bc2eaab4d333aa857c"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.819257 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"897f23b2-ad11-44ed-b0d2-623529b5e559","Type":"ContainerStarted","Data":"b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.819844 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.823995 4728 generic.go:334] "Generic (PLEG): container finished" podID="d63a76e8-4889-4a91-b18a-78229e2818a1" containerID="96334611c6e190bacbd972c0880220372bebf5ee55258ed1d84180c6305af30e" exitCode=0 Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.824199 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" event={"ID":"d63a76e8-4889-4a91-b18a-78229e2818a1","Type":"ContainerDied","Data":"96334611c6e190bacbd972c0880220372bebf5ee55258ed1d84180c6305af30e"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.826450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4m68" event={"ID":"865baf70-58f9-4eee-8cf4-d5e96e6d011e","Type":"ContainerStarted","Data":"15a4243f6731fd3dc1f339e745a0a5b4b0c8cf919fa1cca4e9c2c9e856bd624b"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.836759 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eb629e93-c552-47c3-8c89-11254ffa834f","Type":"ContainerStarted","Data":"1ff0e07080d243b5a0af098e36e82b83ffe1af86db0405eea4070569a215a212"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.841138 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.199244183 
podStartE2EDuration="24.841122373s" podCreationTimestamp="2025-12-16 15:12:30 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.263956456 +0000 UTC m=+945.104135440" lastFinishedPulling="2025-12-16 15:12:53.905834636 +0000 UTC m=+954.746013630" observedRunningTime="2025-12-16 15:12:54.837883304 +0000 UTC m=+955.678062298" watchObservedRunningTime="2025-12-16 15:12:54.841122373 +0000 UTC m=+955.681301357" Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.843534 4728 generic.go:334] "Generic (PLEG): container finished" podID="3d759943-69c0-4ea1-b8cf-93060971988c" containerID="a61cb109dc16a8300f7085f639a21f81606ba222410495e1effb08f1539e5353" exitCode=0 Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.843637 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kc6d" event={"ID":"3d759943-69c0-4ea1-b8cf-93060971988c","Type":"ContainerDied","Data":"a61cb109dc16a8300f7085f639a21f81606ba222410495e1effb08f1539e5353"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.850343 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76f2644a-8bb9-4719-83dd-429202a52446","Type":"ContainerStarted","Data":"b97aba4674722ce4e75420a5f252acc930a6c9514c184ad8cbda15cdc9b742fc"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.853654 4728 generic.go:334] "Generic (PLEG): container finished" podID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerID="e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a" exitCode=0 Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.853752 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" event={"ID":"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b","Type":"ContainerDied","Data":"e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.853795 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" event={"ID":"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b","Type":"ContainerStarted","Data":"f8fca9500e055c140505ad07b29285f5eccadcca18fbbb8dea0bb5511b14753e"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.863526 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlkkv" event={"ID":"37c82b8b-fe2d-4265-80b1-7cdfa00e2be7","Type":"ContainerStarted","Data":"ecd44ba131aee50610ca308823b930ba5c0a63d5c07a0524216aa0b3aefe2dc8"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.863591 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hlkkv" Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.877007 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b2809df7-1873-474c-ab44-14b82f630cb0","Type":"ContainerStarted","Data":"b591c5891de3f8032ecefd6f532a836877b2f02e751198b12c3953c6bf26dde2"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.890542 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d587bd5e-c0c9-48f1-a2b6-616e904ceed3","Type":"ContainerStarted","Data":"6b11d883322bdbea227af238550b72c62b26d19c22af58384533f34da4d3f51f"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.905799 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"8cf2b12c-4959-429e-b9db-173f5ddfab90","Type":"ContainerStarted","Data":"18706562ba7a8436643c11f8977b17d454cf3324334841729cf68b4ad4f025b9"} Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.906080 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 16 15:12:54 crc kubenswrapper[4728]: I1216 15:12:54.951362 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hlkkv" podStartSLOduration=11.231224009 podStartE2EDuration="19.951340564s" podCreationTimestamp="2025-12-16 15:12:35 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.131526783 +0000 UTC m=+944.971705767" lastFinishedPulling="2025-12-16 15:12:52.851643328 +0000 UTC m=+953.691822322" observedRunningTime="2025-12-16 15:12:54.947056597 +0000 UTC m=+955.787235581" watchObservedRunningTime="2025-12-16 15:12:54.951340564 +0000 UTC m=+955.791519548" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.032687 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.684623981 podStartE2EDuration="27.032673841s" podCreationTimestamp="2025-12-16 15:12:28 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.273937471 +0000 UTC m=+945.114116455" lastFinishedPulling="2025-12-16 15:12:52.621987331 +0000 UTC m=+953.462166315" observedRunningTime="2025-12-16 15:12:55.029919876 +0000 UTC m=+955.870098860" watchObservedRunningTime="2025-12-16 15:12:55.032673841 +0000 UTC m=+955.872852825" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.169695 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.244503 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-dns-svc\") pod \"d63a76e8-4889-4a91-b18a-78229e2818a1\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.244582 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpxqt\" (UniqueName: \"kubernetes.io/projected/d63a76e8-4889-4a91-b18a-78229e2818a1-kube-api-access-gpxqt\") pod \"d63a76e8-4889-4a91-b18a-78229e2818a1\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.244617 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-ovsdbserver-nb\") pod \"d63a76e8-4889-4a91-b18a-78229e2818a1\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.244678 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-config\") pod \"d63a76e8-4889-4a91-b18a-78229e2818a1\" (UID: \"d63a76e8-4889-4a91-b18a-78229e2818a1\") " Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.248901 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63a76e8-4889-4a91-b18a-78229e2818a1-kube-api-access-gpxqt" (OuterVolumeSpecName: "kube-api-access-gpxqt") pod "d63a76e8-4889-4a91-b18a-78229e2818a1" (UID: "d63a76e8-4889-4a91-b18a-78229e2818a1"). InnerVolumeSpecName "kube-api-access-gpxqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.263279 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d63a76e8-4889-4a91-b18a-78229e2818a1" (UID: "d63a76e8-4889-4a91-b18a-78229e2818a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.265089 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-config" (OuterVolumeSpecName: "config") pod "d63a76e8-4889-4a91-b18a-78229e2818a1" (UID: "d63a76e8-4889-4a91-b18a-78229e2818a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.265897 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d63a76e8-4889-4a91-b18a-78229e2818a1" (UID: "d63a76e8-4889-4a91-b18a-78229e2818a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.346622 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.346654 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpxqt\" (UniqueName: \"kubernetes.io/projected/d63a76e8-4889-4a91-b18a-78229e2818a1-kube-api-access-gpxqt\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.346670 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.346681 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63a76e8-4889-4a91-b18a-78229e2818a1-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.521021 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7136f947-a2e9-45b7-8add-348387eb9645" path="/var/lib/kubelet/pods/7136f947-a2e9-45b7-8add-348387eb9645/volumes" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.932661 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" event={"ID":"d63a76e8-4889-4a91-b18a-78229e2818a1","Type":"ContainerDied","Data":"d37a458c6be207117db2a3be28490466ea704a7eca8f35bc2eaab4d333aa857c"} Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.932715 4728 scope.go:117] "RemoveContainer" containerID="96334611c6e190bacbd972c0880220372bebf5ee55258ed1d84180c6305af30e" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.932844 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-2x88b" Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.937078 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31e565e7-a84a-436e-bc5d-dc107a42ef0f","Type":"ContainerStarted","Data":"25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5"} Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.940835 4728 generic.go:334] "Generic (PLEG): container finished" podID="865baf70-58f9-4eee-8cf4-d5e96e6d011e" containerID="15a4243f6731fd3dc1f339e745a0a5b4b0c8cf919fa1cca4e9c2c9e856bd624b" exitCode=0 Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.940888 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4m68" event={"ID":"865baf70-58f9-4eee-8cf4-d5e96e6d011e","Type":"ContainerDied","Data":"15a4243f6731fd3dc1f339e745a0a5b4b0c8cf919fa1cca4e9c2c9e856bd624b"} Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.945520 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b12213-b2ec-4fa5-b848-d06fe7855247","Type":"ContainerStarted","Data":"8773469a391248aa723b82a38b327739121d862fabed4fa45660e65d6ebf6b43"} Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.953206 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kc6d" event={"ID":"3d759943-69c0-4ea1-b8cf-93060971988c","Type":"ContainerStarted","Data":"dc9663d3c5987fdb1b5fe366e6a6ad54a6f0dd43c26c459d48661e740b17a604"} Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.959947 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" event={"ID":"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b","Type":"ContainerStarted","Data":"7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3"} Dec 16 15:12:55 crc kubenswrapper[4728]: I1216 15:12:55.960159 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:12:56 crc kubenswrapper[4728]: I1216 15:12:56.029553 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" podStartSLOduration=7.029530542 podStartE2EDuration="7.029530542s" podCreationTimestamp="2025-12-16 15:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:12:56.015492256 +0000 UTC m=+956.855671240" watchObservedRunningTime="2025-12-16 15:12:56.029530542 +0000 UTC m=+956.869709526" Dec 16 15:12:56 crc kubenswrapper[4728]: I1216 15:12:56.041351 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4kc6d" podStartSLOduration=4.373757403 podStartE2EDuration="15.041330077s" podCreationTimestamp="2025-12-16 15:12:41 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.717780499 +0000 UTC m=+945.557959493" lastFinishedPulling="2025-12-16 15:12:55.385353183 +0000 UTC m=+956.225532167" observedRunningTime="2025-12-16 15:12:56.032235327 +0000 UTC m=+956.872414311" watchObservedRunningTime="2025-12-16 15:12:56.041330077 +0000 UTC m=+956.881509061" Dec 16 15:12:56 crc kubenswrapper[4728]: I1216 15:12:56.107577 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-2x88b"] Dec 16 15:12:56 crc kubenswrapper[4728]: I1216 15:12:56.116610 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7fd796d7df-2x88b"] Dec 16 15:12:56 crc kubenswrapper[4728]: I1216 15:12:56.969678 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4m68" event={"ID":"865baf70-58f9-4eee-8cf4-d5e96e6d011e","Type":"ContainerStarted","Data":"c6cc63a0d1df1f3beff0e201cea1280200752e3b37ea4e7f96782eeeff26a634"} Dec 16 15:12:57 crc kubenswrapper[4728]: I1216 15:12:57.522654 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63a76e8-4889-4a91-b18a-78229e2818a1" path="/var/lib/kubelet/pods/d63a76e8-4889-4a91-b18a-78229e2818a1/volumes" Dec 16 15:12:57 crc kubenswrapper[4728]: I1216 15:12:57.985616 4728 generic.go:334] "Generic (PLEG): container finished" podID="76f2644a-8bb9-4719-83dd-429202a52446" containerID="b97aba4674722ce4e75420a5f252acc930a6c9514c184ad8cbda15cdc9b742fc" exitCode=0 Dec 16 15:12:57 crc kubenswrapper[4728]: I1216 15:12:57.985660 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76f2644a-8bb9-4719-83dd-429202a52446","Type":"ContainerDied","Data":"b97aba4674722ce4e75420a5f252acc930a6c9514c184ad8cbda15cdc9b742fc"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.002614 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"76f2644a-8bb9-4719-83dd-429202a52446","Type":"ContainerStarted","Data":"13e36d6ada6329df3f08a69aa85a4d4d30f0bc3557263615e30a150a8443979c"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.005952 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ccc6t" event={"ID":"effa7d99-cccc-431b-91b6-d4302f7dce22","Type":"ContainerStarted","Data":"16876063da2f819c54b9fe4f104fc80bbfda18c6065b8bea32fc77bc5a2179e8"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.009672 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b2809df7-1873-474c-ab44-14b82f630cb0","Type":"ContainerStarted","Data":"e9939765bc1925c038023010ccd19f94c9226a3793d9719ad2a0599b0a10af49"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.011765 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d587bd5e-c0c9-48f1-a2b6-616e904ceed3","Type":"ContainerStarted","Data":"eeea3c685477a30312706095f8ffed3235c6717d9b0a0cad86ca634398d5b700"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.014384 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4m68" event={"ID":"865baf70-58f9-4eee-8cf4-d5e96e6d011e","Type":"ContainerStarted","Data":"3cbd6f653d377fb71d99cc442a9840c6fe6dcc2aa98c3ffebdca5deda2cae086"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.014654 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.014678 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.019948 4728 generic.go:334] "Generic (PLEG): container finished" podID="eb629e93-c552-47c3-8c89-11254ffa834f" containerID="1ff0e07080d243b5a0af098e36e82b83ffe1af86db0405eea4070569a215a212" exitCode=0 Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.019986 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"eb629e93-c552-47c3-8c89-11254ffa834f","Type":"ContainerDied","Data":"1ff0e07080d243b5a0af098e36e82b83ffe1af86db0405eea4070569a215a212"} Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.064348 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.204433066 podStartE2EDuration="25.064329381s" podCreationTimestamp="2025-12-16 15:12:34 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.206837235 +0000 UTC m=+945.047016219" lastFinishedPulling="2025-12-16 15:12:58.06673355 +0000 UTC m=+958.906912534" observedRunningTime="2025-12-16 15:12:59.056453814 +0000 UTC m=+959.896632798" watchObservedRunningTime="2025-12-16 15:12:59.064329381 +0000 UTC m=+959.904508365" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.067868 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.985316342 podStartE2EDuration="32.067850648s" podCreationTimestamp="2025-12-16 15:12:27 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.076770068 +0000 UTC m=+944.916949052" lastFinishedPulling="2025-12-16 15:12:52.159304354 +0000 UTC m=+952.999483358" observedRunningTime="2025-12-16 15:12:59.035181139 +0000 UTC m=+959.875360133" watchObservedRunningTime="2025-12-16 15:12:59.067850648 +0000 UTC m=+959.908029642" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.084507 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ccc6t" podStartSLOduration=5.859206361 podStartE2EDuration="10.084467335s" podCreationTimestamp="2025-12-16 15:12:49 +0000 UTC" firstStartedPulling="2025-12-16 15:12:53.83253767 +0000 UTC m=+954.672716654" lastFinishedPulling="2025-12-16 15:12:58.057798644 +0000 UTC m=+958.897977628" observedRunningTime="2025-12-16 15:12:59.081037641 +0000 UTC m=+959.921216625" watchObservedRunningTime="2025-12-16 15:12:59.084467335 +0000 UTC m=+959.924646329" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.119760 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-j4m68" podStartSLOduration=15.820118677 podStartE2EDuration="24.119739675s" podCreationTimestamp="2025-12-16 15:12:35 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.31499069 +0000 UTC m=+945.155169664" lastFinishedPulling="2025-12-16 15:12:52.614611678 +0000 UTC m=+953.454790662" observedRunningTime="2025-12-16 15:12:59.106951963 +0000 UTC m=+959.947130957" watchObservedRunningTime="2025-12-16 15:12:59.119739675 +0000 UTC m=+959.959918659" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.168530 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.477042596 podStartE2EDuration="23.168511817s" podCreationTimestamp="2025-12-16 15:12:36 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.429950543 +0000 UTC m=+945.270129527" lastFinishedPulling="2025-12-16 15:12:58.121419764 +0000 UTC m=+958.961598748" observedRunningTime="2025-12-16 15:12:59.163976932 +0000 UTC m=+960.004155926" watchObservedRunningTime="2025-12-16 15:12:59.168511817 +0000 UTC m=+960.008690801" Dec 16 15:12:59 crc kubenswrapper[4728]: E1216 15:12:59.535968 4728 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/5e7acfedc3cc20f27a09c2e413ed3a15222d8f71d4389878e1df429bfc39c163/diff" to get inode usage: stat 
/var/lib/containers/storage/overlay/5e7acfedc3cc20f27a09c2e413ed3a15222d8f71d4389878e1df429bfc39c163/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openshift-marketplace_certified-operators-jf5hh_06a939f4-82ee-43e0-8a85-ad8db9e76b64/extract-content/0.log" to get inode usage: stat /var/log/pods/openshift-marketplace_certified-operators-jf5hh_06a939f4-82ee-43e0-8a85-ad8db9e76b64/extract-content/0.log: no such file or directory Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.705893 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 16 15:12:59 crc kubenswrapper[4728]: I1216 15:12:59.763904 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.030449 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eb629e93-c552-47c3-8c89-11254ffa834f","Type":"ContainerStarted","Data":"8cf53e1ce61f6617508c1ccf2b85750de5868e2f7b795c485843c34b82ee2500"} Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.030511 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.074846 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.598033599 podStartE2EDuration="34.074826216s" podCreationTimestamp="2025-12-16 15:12:26 +0000 UTC" firstStartedPulling="2025-12-16 15:12:44.118380472 +0000 UTC m=+944.958559456" lastFinishedPulling="2025-12-16 15:12:50.595173099 +0000 UTC m=+951.435352073" observedRunningTime="2025-12-16 15:13:00.050094037 +0000 UTC m=+960.890273051" watchObservedRunningTime="2025-12-16 15:13:00.074826216 +0000 UTC m=+960.915005210" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.080591 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.370612 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.421874 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-b86k8"] Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.422113 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" podUID="2db861e2-eceb-4128-a546-8c34cc829276" containerName="dnsmasq-dns" containerID="cri-o://7a381d766b0fc86fd27fb3b714fb2ba87e1dcbe75bde3162cacdb6ecef72e8c6" gracePeriod=10 Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.617992 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.618036 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.782196 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 16 15:13:00 crc kubenswrapper[4728]: I1216 15:13:00.896720 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.037249 4728 generic.go:334] "Generic (PLEG): 
container finished" podID="2db861e2-eceb-4128-a546-8c34cc829276" containerID="7a381d766b0fc86fd27fb3b714fb2ba87e1dcbe75bde3162cacdb6ecef72e8c6" exitCode=0 Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.037538 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" event={"ID":"2db861e2-eceb-4128-a546-8c34cc829276","Type":"ContainerDied","Data":"7a381d766b0fc86fd27fb3b714fb2ba87e1dcbe75bde3162cacdb6ecef72e8c6"} Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.402390 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.474166 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-dns-svc\") pod \"2db861e2-eceb-4128-a546-8c34cc829276\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.474260 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-config\") pod \"2db861e2-eceb-4128-a546-8c34cc829276\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.474299 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zptbz\" (UniqueName: \"kubernetes.io/projected/2db861e2-eceb-4128-a546-8c34cc829276-kube-api-access-zptbz\") pod \"2db861e2-eceb-4128-a546-8c34cc829276\" (UID: \"2db861e2-eceb-4128-a546-8c34cc829276\") " Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.479633 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db861e2-eceb-4128-a546-8c34cc829276-kube-api-access-zptbz" (OuterVolumeSpecName: "kube-api-access-zptbz") pod "2db861e2-eceb-4128-a546-8c34cc829276" (UID: "2db861e2-eceb-4128-a546-8c34cc829276"). InnerVolumeSpecName "kube-api-access-zptbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.511168 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2db861e2-eceb-4128-a546-8c34cc829276" (UID: "2db861e2-eceb-4128-a546-8c34cc829276"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.511273 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-config" (OuterVolumeSpecName: "config") pod "2db861e2-eceb-4128-a546-8c34cc829276" (UID: "2db861e2-eceb-4128-a546-8c34cc829276"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.576716 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.576749 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db861e2-eceb-4128-a546-8c34cc829276-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.576852 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zptbz\" (UniqueName: \"kubernetes.io/projected/2db861e2-eceb-4128-a546-8c34cc829276-kube-api-access-zptbz\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.864737 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.864791 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.906983 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.948040 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:13:01 crc kubenswrapper[4728]: I1216 15:13:01.954418 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.048360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" event={"ID":"2db861e2-eceb-4128-a546-8c34cc829276","Type":"ContainerDied","Data":"c480295e6d9b5f6d633deff6946735d65099dcda59c7c38894b7439a87ba2fe7"} Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.048453 4728 scope.go:117] "RemoveContainer" containerID="7a381d766b0fc86fd27fb3b714fb2ba87e1dcbe75bde3162cacdb6ecef72e8c6" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.048855 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-b86k8" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.048896 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.077759 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-b86k8"] Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.081275 4728 scope.go:117] "RemoveContainer" containerID="6fd4884b3c6e09769ee17288ab4c2b53688a0ef4e041efdfc25b2deb1b71ff8b" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.083910 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-b86k8"] Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.113357 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.120991 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.246215 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4kc6d"] Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.313594 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 16 15:13:02 crc kubenswrapper[4728]: E1216 15:13:02.313893 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63a76e8-4889-4a91-b18a-78229e2818a1" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.313912 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63a76e8-4889-4a91-b18a-78229e2818a1" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: E1216 15:13:02.313937 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db861e2-eceb-4128-a546-8c34cc829276" containerName="dnsmasq-dns" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.313945 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db861e2-eceb-4128-a546-8c34cc829276" containerName="dnsmasq-dns" Dec 16 15:13:02 crc kubenswrapper[4728]: E1216 15:13:02.313954 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="dnsmasq-dns" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.313960 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="dnsmasq-dns" Dec 16 15:13:02 crc kubenswrapper[4728]: E1216 15:13:02.313975 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.313980 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: E1216 15:13:02.314009 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db861e2-eceb-4128-a546-8c34cc829276" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.314015 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db861e2-eceb-4128-a546-8c34cc829276" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.314168 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7136f947-a2e9-45b7-8add-348387eb9645" containerName="dnsmasq-dns" Dec 16 15:13:02 crc 
kubenswrapper[4728]: I1216 15:13:02.314180 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db861e2-eceb-4128-a546-8c34cc829276" containerName="dnsmasq-dns" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.314195 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63a76e8-4889-4a91-b18a-78229e2818a1" containerName="init" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.315783 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.318194 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.318352 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4hsqv" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.318482 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.318863 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.332346 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.390651 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-scripts\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.390722 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-config\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.390821 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.390861 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.390903 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.390950 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q85r\" (UniqueName: \"kubernetes.io/projected/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-kube-api-access-9q85r\") pod \"ovn-northd-0\" (UID: 
\"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.391014 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492301 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492361 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-scripts\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492401 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-config\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492489 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492512 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492535 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.492574 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q85r\" (UniqueName: \"kubernetes.io/projected/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-kube-api-access-9q85r\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.493676 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.494026 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-config\") pod 
\"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.494910 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-scripts\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.497274 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.497377 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.499116 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.508617 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q85r\" (UniqueName: \"kubernetes.io/projected/d390ca4f-5aa2-45e8-a08e-b2e86218e36f-kube-api-access-9q85r\") pod \"ovn-northd-0\" (UID: \"d390ca4f-5aa2-45e8-a08e-b2e86218e36f\") " pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: E1216 15:13:02.541171 4728 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.210:39348->38.102.83.210:34353: write tcp 38.102.83.210:39348->38.102.83.210:34353: write: broken pipe Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.632576 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 16 15:13:02 crc kubenswrapper[4728]: I1216 15:13:02.962597 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 16 15:13:03 crc kubenswrapper[4728]: I1216 15:13:03.055814 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d390ca4f-5aa2-45e8-a08e-b2e86218e36f","Type":"ContainerStarted","Data":"11b864cff47b5aa7e3dcae911f347b4fab40749a6340af6f3d97eefbe81615d5"} Dec 16 15:13:03 crc kubenswrapper[4728]: I1216 15:13:03.517341 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db861e2-eceb-4128-a546-8c34cc829276" path="/var/lib/kubelet/pods/2db861e2-eceb-4128-a546-8c34cc829276/volumes" Dec 16 15:13:04 crc kubenswrapper[4728]: I1216 15:13:04.064032 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4kc6d" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="registry-server" containerID="cri-o://dc9663d3c5987fdb1b5fe366e6a6ad54a6f0dd43c26c459d48661e740b17a604" gracePeriod=2 Dec 16 15:13:05 crc kubenswrapper[4728]: I1216 15:13:05.076212 4728 generic.go:334] "Generic (PLEG): container finished" podID="3d759943-69c0-4ea1-b8cf-93060971988c" containerID="dc9663d3c5987fdb1b5fe366e6a6ad54a6f0dd43c26c459d48661e740b17a604" exitCode=0 Dec 16 15:13:05 crc kubenswrapper[4728]: I1216 15:13:05.076292 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kc6d" event={"ID":"3d759943-69c0-4ea1-b8cf-93060971988c","Type":"ContainerDied","Data":"dc9663d3c5987fdb1b5fe366e6a6ad54a6f0dd43c26c459d48661e740b17a604"} Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.045771 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.090974 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kc6d" event={"ID":"3d759943-69c0-4ea1-b8cf-93060971988c","Type":"ContainerDied","Data":"fdecd74d1f9f5a52f25aa30f402179f6191bdc610876a45f9fe2442229346741"} Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.091028 4728 scope.go:117] "RemoveContainer" containerID="dc9663d3c5987fdb1b5fe366e6a6ad54a6f0dd43c26c459d48661e740b17a604" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.091066 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4kc6d" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.157357 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr2gk\" (UniqueName: \"kubernetes.io/projected/3d759943-69c0-4ea1-b8cf-93060971988c-kube-api-access-fr2gk\") pod \"3d759943-69c0-4ea1-b8cf-93060971988c\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.157574 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-utilities\") pod \"3d759943-69c0-4ea1-b8cf-93060971988c\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.157617 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-catalog-content\") pod \"3d759943-69c0-4ea1-b8cf-93060971988c\" (UID: \"3d759943-69c0-4ea1-b8cf-93060971988c\") " Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.158537 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-utilities" (OuterVolumeSpecName: "utilities") pod "3d759943-69c0-4ea1-b8cf-93060971988c" (UID: "3d759943-69c0-4ea1-b8cf-93060971988c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.163654 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d759943-69c0-4ea1-b8cf-93060971988c-kube-api-access-fr2gk" (OuterVolumeSpecName: "kube-api-access-fr2gk") pod "3d759943-69c0-4ea1-b8cf-93060971988c" (UID: "3d759943-69c0-4ea1-b8cf-93060971988c"). InnerVolumeSpecName "kube-api-access-fr2gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.210755 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d759943-69c0-4ea1-b8cf-93060971988c" (UID: "3d759943-69c0-4ea1-b8cf-93060971988c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.259558 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr2gk\" (UniqueName: \"kubernetes.io/projected/3d759943-69c0-4ea1-b8cf-93060971988c-kube-api-access-fr2gk\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.259595 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.259606 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d759943-69c0-4ea1-b8cf-93060971988c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.437801 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4kc6d"] Dec 16 15:13:06 crc kubenswrapper[4728]: I1216 15:13:06.445461 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4kc6d"] Dec 16 15:13:07 crc kubenswrapper[4728]: I1216 15:13:07.514763 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" path="/var/lib/kubelet/pods/3d759943-69c0-4ea1-b8cf-93060971988c/volumes" Dec 16 15:13:07 crc kubenswrapper[4728]: I1216 15:13:07.541634 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 16 15:13:07 crc kubenswrapper[4728]: I1216 15:13:07.541695 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 16 15:13:08 crc kubenswrapper[4728]: I1216 15:13:08.299111 4728 scope.go:117] "RemoveContainer" containerID="a61cb109dc16a8300f7085f639a21f81606ba222410495e1effb08f1539e5353" Dec 16 15:13:08 crc kubenswrapper[4728]: I1216 15:13:08.322906 4728 scope.go:117] "RemoveContainer" containerID="5d54a14280ffcd49a5c68a799bd99a4e93ad5c301edce2bbbf8243f820cf573f" Dec 16 15:13:10 crc kubenswrapper[4728]: I1216 15:13:10.935640 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jmrcl"] Dec 16 15:13:10 crc kubenswrapper[4728]: E1216 15:13:10.936184 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="registry-server" Dec 16 15:13:10 crc kubenswrapper[4728]: I1216 15:13:10.936197 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="registry-server" Dec 16 15:13:10 crc kubenswrapper[4728]: E1216 15:13:10.936216 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="extract-content" Dec 16 15:13:10 crc kubenswrapper[4728]: I1216 15:13:10.936224 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="extract-content" Dec 16 15:13:10 crc kubenswrapper[4728]: E1216 15:13:10.936242 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="extract-utilities" Dec 16 15:13:10 crc kubenswrapper[4728]: I1216 15:13:10.936250 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="extract-utilities" Dec 16 15:13:10 crc 
kubenswrapper[4728]: I1216 15:13:10.936418 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d759943-69c0-4ea1-b8cf-93060971988c" containerName="registry-server" Dec 16 15:13:10 crc kubenswrapper[4728]: I1216 15:13:10.939964 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:10 crc kubenswrapper[4728]: I1216 15:13:10.947172 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jmrcl"] Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.040883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-dns-svc\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.041418 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-config\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.041454 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.041928 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqffm\" (UniqueName: \"kubernetes.io/projected/a7035612-bffa-4357-aa7e-897240b10c43-kube-api-access-sqffm\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.042050 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.144843 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.144928 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-dns-svc\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.144976 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-config\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.145028 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.145194 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqffm\" (UniqueName: \"kubernetes.io/projected/a7035612-bffa-4357-aa7e-897240b10c43-kube-api-access-sqffm\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.146219 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.146750 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-dns-svc\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.147291 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-config\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.147869 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.164485 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqffm\" (UniqueName: \"kubernetes.io/projected/a7035612-bffa-4357-aa7e-897240b10c43-kube-api-access-sqffm\") pod \"dnsmasq-dns-698758b865-jmrcl\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.265950 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.656121 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.721313 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jmrcl"] Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.749049 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.771840 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:11 crc kubenswrapper[4728]: I1216 15:13:11.864658 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.026568 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.031382 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.036481 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.036733 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.036760 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.036781 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-267n4" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.044993 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.142006 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jmrcl" event={"ID":"a7035612-bffa-4357-aa7e-897240b10c43","Type":"ContainerStarted","Data":"73281e5377ce7eb69d7ece92f1102d4238b126bbf8a093e07625156d918ce09d"} Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.167397 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.167760 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-657xg\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-kube-api-access-657xg\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.167881 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fc3761f8-7e22-45e1-8119-a40338b80f1d-cache\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc 
kubenswrapper[4728]: I1216 15:13:12.167918 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.167970 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fc3761f8-7e22-45e1-8119-a40338b80f1d-lock\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.269307 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-657xg\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-kube-api-access-657xg\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.269452 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fc3761f8-7e22-45e1-8119-a40338b80f1d-cache\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.269478 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.269562 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fc3761f8-7e22-45e1-8119-a40338b80f1d-lock\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.269583 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.269716 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.269729 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.269778 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift podName:fc3761f8-7e22-45e1-8119-a40338b80f1d nodeName:}" failed. No retries permitted until 2025-12-16 15:13:12.769761464 +0000 UTC m=+973.609940448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift") pod "swift-storage-0" (UID: "fc3761f8-7e22-45e1-8119-a40338b80f1d") : configmap "swift-ring-files" not found Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.270637 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.271477 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fc3761f8-7e22-45e1-8119-a40338b80f1d-cache\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.271706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fc3761f8-7e22-45e1-8119-a40338b80f1d-lock\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.290051 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-657xg\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-kube-api-access-657xg\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.305548 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.557726 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-f45d6"] Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.558639 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.561885 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.561947 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.570365 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.581882 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-f45d6"] Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.582461 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-hzcmp ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-hzcmp ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-f45d6" podUID="c407857a-ebf4-49b1-9347-3ecdc450ac39" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.589192 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vvql8"] Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.590100 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.598374 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-f45d6"] Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.606450 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vvql8"] Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675247 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-combined-ca-bundle\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675303 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-ring-data-devices\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675425 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-dispersionconf\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675496 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-swiftconf\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc 
kubenswrapper[4728]: I1216 15:13:12.675554 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4847\" (UniqueName: \"kubernetes.io/projected/55ebd6bb-cac2-4b8f-932d-46662c011b18-kube-api-access-g4847\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675716 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-ring-data-devices\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675854 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-scripts\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675881 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c407857a-ebf4-49b1-9347-3ecdc450ac39-etc-swift\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675915 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-scripts\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.675977 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzcmp\" (UniqueName: \"kubernetes.io/projected/c407857a-ebf4-49b1-9347-3ecdc450ac39-kube-api-access-hzcmp\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.676036 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-dispersionconf\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.676103 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-swiftconf\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.676148 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-combined-ca-bundle\") pod \"swift-ring-rebalance-vvql8\" (UID: 
\"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.676572 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55ebd6bb-cac2-4b8f-932d-46662c011b18-etc-swift\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.778291 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-dispersionconf\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.778341 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-swiftconf\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.778377 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4847\" (UniqueName: \"kubernetes.io/projected/55ebd6bb-cac2-4b8f-932d-46662c011b18-kube-api-access-g4847\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.778448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-ring-data-devices\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.778823 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-scripts\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.778862 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c407857a-ebf4-49b1-9347-3ecdc450ac39-etc-swift\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.779496 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-scripts\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.779880 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzcmp\" (UniqueName: \"kubernetes.io/projected/c407857a-ebf4-49b1-9347-3ecdc450ac39-kube-api-access-hzcmp\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 
15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.779565 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-scripts\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.779384 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-ring-data-devices\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.779437 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c407857a-ebf4-49b1-9347-3ecdc450ac39-etc-swift\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.780230 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-scripts\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.779906 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-dispersionconf\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.780542 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.780597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-swiftconf\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.780670 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.780700 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:13:12 crc kubenswrapper[4728]: E1216 15:13:12.780753 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift podName:fc3761f8-7e22-45e1-8119-a40338b80f1d nodeName:}" failed. No retries permitted until 2025-12-16 15:13:13.780732559 +0000 UTC m=+974.620911613 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift") pod "swift-storage-0" (UID: "fc3761f8-7e22-45e1-8119-a40338b80f1d") : configmap "swift-ring-files" not found Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.780786 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-combined-ca-bundle\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.780912 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55ebd6bb-cac2-4b8f-932d-46662c011b18-etc-swift\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.780988 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-combined-ca-bundle\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.781021 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-ring-data-devices\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.781525 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55ebd6bb-cac2-4b8f-932d-46662c011b18-etc-swift\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.783281 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-ring-data-devices\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.783932 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-dispersionconf\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.784186 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-dispersionconf\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.784634 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-swiftconf\") pod 
\"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.785585 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-combined-ca-bundle\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.787851 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-swiftconf\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.788434 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-combined-ca-bundle\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.801993 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzcmp\" (UniqueName: \"kubernetes.io/projected/c407857a-ebf4-49b1-9347-3ecdc450ac39-kube-api-access-hzcmp\") pod \"swift-ring-rebalance-f45d6\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.807204 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4847\" (UniqueName: \"kubernetes.io/projected/55ebd6bb-cac2-4b8f-932d-46662c011b18-kube-api-access-g4847\") pod \"swift-ring-rebalance-vvql8\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:12 crc kubenswrapper[4728]: I1216 15:13:12.957552 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.152877 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d390ca4f-5aa2-45e8-a08e-b2e86218e36f","Type":"ContainerStarted","Data":"1f768d32c3fd85800311577600247312f5ca684993ee3b041c0443b60abc6bb7"} Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.152949 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d390ca4f-5aa2-45e8-a08e-b2e86218e36f","Type":"ContainerStarted","Data":"13c5d37280772f69c002581387467ef89edb0e710e10ab9e1109e8ce5b67c662"} Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.153040 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.157640 4728 generic.go:334] "Generic (PLEG): container finished" podID="a7035612-bffa-4357-aa7e-897240b10c43" containerID="f4cd2463810d570605af672c0280bed5f051a2901ad811a329f0cd38cb34d9ad" exitCode=0 Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.157726 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.157734 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jmrcl" event={"ID":"a7035612-bffa-4357-aa7e-897240b10c43","Type":"ContainerDied","Data":"f4cd2463810d570605af672c0280bed5f051a2901ad811a329f0cd38cb34d9ad"} Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.223737 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.099060605 podStartE2EDuration="11.223722177s" podCreationTimestamp="2025-12-16 15:13:02 +0000 UTC" firstStartedPulling="2025-12-16 15:13:02.96255492 +0000 UTC m=+963.802733904" lastFinishedPulling="2025-12-16 15:13:12.087216492 +0000 UTC m=+972.927395476" observedRunningTime="2025-12-16 15:13:13.188250174 +0000 UTC m=+974.028429168" watchObservedRunningTime="2025-12-16 15:13:13.223722177 +0000 UTC m=+974.063901161" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.244952 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vvql8"] Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.287338 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.391911 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-dispersionconf\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.391999 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-swiftconf\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.392082 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-scripts\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.392118 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c407857a-ebf4-49b1-9347-3ecdc450ac39-etc-swift\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.392365 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzcmp\" (UniqueName: \"kubernetes.io/projected/c407857a-ebf4-49b1-9347-3ecdc450ac39-kube-api-access-hzcmp\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.392550 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c407857a-ebf4-49b1-9347-3ecdc450ac39-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.392724 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-combined-ca-bundle\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.392903 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-ring-data-devices\") pod \"c407857a-ebf4-49b1-9347-3ecdc450ac39\" (UID: \"c407857a-ebf4-49b1-9347-3ecdc450ac39\") " Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.393620 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-scripts" (OuterVolumeSpecName: "scripts") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.394082 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.394174 4728 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c407857a-ebf4-49b1-9347-3ecdc450ac39-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.394178 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.396809 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.396957 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.396984 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c407857a-ebf4-49b1-9347-3ecdc450ac39-kube-api-access-hzcmp" (OuterVolumeSpecName: "kube-api-access-hzcmp") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "kube-api-access-hzcmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.398174 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c407857a-ebf4-49b1-9347-3ecdc450ac39" (UID: "c407857a-ebf4-49b1-9347-3ecdc450ac39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.495602 4728 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.495636 4728 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.495645 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzcmp\" (UniqueName: \"kubernetes.io/projected/c407857a-ebf4-49b1-9347-3ecdc450ac39-kube-api-access-hzcmp\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.495655 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c407857a-ebf4-49b1-9347-3ecdc450ac39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.495665 4728 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c407857a-ebf4-49b1-9347-3ecdc450ac39-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:13 crc kubenswrapper[4728]: I1216 15:13:13.801874 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:13 crc kubenswrapper[4728]: E1216 15:13:13.802139 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:13:13 crc kubenswrapper[4728]: E1216 15:13:13.802451 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:13:13 crc kubenswrapper[4728]: E1216 15:13:13.802551 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift podName:fc3761f8-7e22-45e1-8119-a40338b80f1d nodeName:}" failed. No retries permitted until 2025-12-16 15:13:15.80252348 +0000 UTC m=+976.642702504 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift") pod "swift-storage-0" (UID: "fc3761f8-7e22-45e1-8119-a40338b80f1d") : configmap "swift-ring-files" not found Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.174080 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvql8" event={"ID":"55ebd6bb-cac2-4b8f-932d-46662c011b18","Type":"ContainerStarted","Data":"584e1624d0f9fcd2de0f6843381f76ff8f59c517350341d81ff229f9e5c3d8c1"} Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.179841 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f45d6" Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.180216 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jmrcl" event={"ID":"a7035612-bffa-4357-aa7e-897240b10c43","Type":"ContainerStarted","Data":"81615126d01d5494e4472780ee7dfba71331aadf757f7b5a51b1e7d2ef5ba66b"} Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.180748 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.212211 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jmrcl" podStartSLOduration=4.212193131 podStartE2EDuration="4.212193131s" podCreationTimestamp="2025-12-16 15:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:13:14.204153757 +0000 UTC m=+975.044332741" watchObservedRunningTime="2025-12-16 15:13:14.212193131 +0000 UTC m=+975.052372115" Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.263036 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-f45d6"] Dec 16 15:13:14 crc kubenswrapper[4728]: I1216 15:13:14.270650 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-f45d6"] Dec 16 15:13:15 crc kubenswrapper[4728]: I1216 15:13:15.518082 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c407857a-ebf4-49b1-9347-3ecdc450ac39" path="/var/lib/kubelet/pods/c407857a-ebf4-49b1-9347-3ecdc450ac39/volumes" Dec 16 15:13:15 crc kubenswrapper[4728]: I1216 15:13:15.837336 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:15 crc kubenswrapper[4728]: E1216 15:13:15.837646 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:13:15 crc kubenswrapper[4728]: E1216 15:13:15.837701 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:13:15 crc kubenswrapper[4728]: E1216 15:13:15.837835 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift podName:fc3761f8-7e22-45e1-8119-a40338b80f1d nodeName:}" failed. No retries permitted until 2025-12-16 15:13:19.837792667 +0000 UTC m=+980.677971661 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift") pod "swift-storage-0" (UID: "fc3761f8-7e22-45e1-8119-a40338b80f1d") : configmap "swift-ring-files" not found Dec 16 15:13:17 crc kubenswrapper[4728]: I1216 15:13:17.213743 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvql8" event={"ID":"55ebd6bb-cac2-4b8f-932d-46662c011b18","Type":"ContainerStarted","Data":"e25003fa40aa710f08944965f1c59c8ec0bf77fa75cd80f0b6b429f67d158f91"} Dec 16 15:13:17 crc kubenswrapper[4728]: I1216 15:13:17.249057 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vvql8" podStartSLOduration=2.1372111289999998 podStartE2EDuration="5.2490373s" podCreationTimestamp="2025-12-16 15:13:12 +0000 UTC" firstStartedPulling="2025-12-16 15:13:13.235546301 +0000 UTC m=+974.075725285" lastFinishedPulling="2025-12-16 15:13:16.347372462 +0000 UTC m=+977.187551456" observedRunningTime="2025-12-16 15:13:17.237798422 +0000 UTC m=+978.077977446" watchObservedRunningTime="2025-12-16 15:13:17.2490373 +0000 UTC m=+978.089216294" Dec 16 15:13:18 crc kubenswrapper[4728]: I1216 15:13:18.934235 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dc7b-account-create-update-hk9vg"] Dec 16 15:13:18 crc kubenswrapper[4728]: I1216 15:13:18.939492 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:18 crc kubenswrapper[4728]: I1216 15:13:18.949821 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bn62h"] Dec 16 15:13:18 crc kubenswrapper[4728]: I1216 15:13:18.951000 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:18 crc kubenswrapper[4728]: I1216 15:13:18.960507 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dc7b-account-create-update-hk9vg"] Dec 16 15:13:18 crc kubenswrapper[4728]: I1216 15:13:18.984022 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.003903 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bn62h"] Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.099138 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p866v\" (UniqueName: \"kubernetes.io/projected/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-kube-api-access-p866v\") pod \"keystone-db-create-bn62h\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.099178 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfxvq\" (UniqueName: \"kubernetes.io/projected/e422ae6c-3605-4278-93aa-116a092e1f95-kube-api-access-hfxvq\") pod \"keystone-dc7b-account-create-update-hk9vg\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.099306 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e422ae6c-3605-4278-93aa-116a092e1f95-operator-scripts\") pod \"keystone-dc7b-account-create-update-hk9vg\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.099447 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-operator-scripts\") pod \"keystone-db-create-bn62h\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.152050 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-657w4"] Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.154156 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.165571 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-657w4"] Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.200715 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e422ae6c-3605-4278-93aa-116a092e1f95-operator-scripts\") pod \"keystone-dc7b-account-create-update-hk9vg\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.200835 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-operator-scripts\") pod \"keystone-db-create-bn62h\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.200918 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p866v\" (UniqueName: \"kubernetes.io/projected/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-kube-api-access-p866v\") pod \"keystone-db-create-bn62h\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.200941 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfxvq\" (UniqueName: \"kubernetes.io/projected/e422ae6c-3605-4278-93aa-116a092e1f95-kube-api-access-hfxvq\") pod \"keystone-dc7b-account-create-update-hk9vg\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.201723 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e422ae6c-3605-4278-93aa-116a092e1f95-operator-scripts\") pod \"keystone-dc7b-account-create-update-hk9vg\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.202017 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-operator-scripts\") pod \"keystone-db-create-bn62h\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.232320 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p866v\" (UniqueName: \"kubernetes.io/projected/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-kube-api-access-p866v\") pod \"keystone-db-create-bn62h\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.234521 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfxvq\" (UniqueName: \"kubernetes.io/projected/e422ae6c-3605-4278-93aa-116a092e1f95-kube-api-access-hfxvq\") pod \"keystone-dc7b-account-create-update-hk9vg\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.302590 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfhff\" (UniqueName: \"kubernetes.io/projected/e0214724-b0c1-40f7-b086-6fea171a8500-kube-api-access-zfhff\") pod \"placement-db-create-657w4\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.303689 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0214724-b0c1-40f7-b086-6fea171a8500-operator-scripts\") pod \"placement-db-create-657w4\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.313836 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-51a7-account-create-update-l9t68"] Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.315635 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.316644 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.317333 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.326884 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.329033 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-51a7-account-create-update-l9t68"] Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.405443 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghxtr\" (UniqueName: \"kubernetes.io/projected/f6444b69-7cc1-4cbd-a266-00a9f064d649-kube-api-access-ghxtr\") pod \"placement-51a7-account-create-update-l9t68\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.405509 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0214724-b0c1-40f7-b086-6fea171a8500-operator-scripts\") pod \"placement-db-create-657w4\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.405586 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfhff\" (UniqueName: \"kubernetes.io/projected/e0214724-b0c1-40f7-b086-6fea171a8500-kube-api-access-zfhff\") pod \"placement-db-create-657w4\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.405634 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6444b69-7cc1-4cbd-a266-00a9f064d649-operator-scripts\") pod \"placement-51a7-account-create-update-l9t68\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 
15:13:19.406522 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0214724-b0c1-40f7-b086-6fea171a8500-operator-scripts\") pod \"placement-db-create-657w4\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.429941 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfhff\" (UniqueName: \"kubernetes.io/projected/e0214724-b0c1-40f7-b086-6fea171a8500-kube-api-access-zfhff\") pod \"placement-db-create-657w4\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.472131 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-657w4" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.507993 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghxtr\" (UniqueName: \"kubernetes.io/projected/f6444b69-7cc1-4cbd-a266-00a9f064d649-kube-api-access-ghxtr\") pod \"placement-51a7-account-create-update-l9t68\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.508179 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6444b69-7cc1-4cbd-a266-00a9f064d649-operator-scripts\") pod \"placement-51a7-account-create-update-l9t68\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.509400 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6444b69-7cc1-4cbd-a266-00a9f064d649-operator-scripts\") pod \"placement-51a7-account-create-update-l9t68\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.525477 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghxtr\" (UniqueName: \"kubernetes.io/projected/f6444b69-7cc1-4cbd-a266-00a9f064d649-kube-api-access-ghxtr\") pod \"placement-51a7-account-create-update-l9t68\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.706256 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.750575 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bn62h"] Dec 16 15:13:19 crc kubenswrapper[4728]: W1216 15:13:19.757788 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda013edbf_b6b3_46ab_b13e_a27d3ddab2c4.slice/crio-db5626bb6098065a0cf63cd1bba3a58954d2ded2c24149ec3867cc36a31f003b WatchSource:0}: Error finding container db5626bb6098065a0cf63cd1bba3a58954d2ded2c24149ec3867cc36a31f003b: Status 404 returned error can't find the container with id db5626bb6098065a0cf63cd1bba3a58954d2ded2c24149ec3867cc36a31f003b Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.848631 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dc7b-account-create-update-hk9vg"] Dec 16 15:13:19 crc kubenswrapper[4728]: W1216 15:13:19.859701 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode422ae6c_3605_4278_93aa_116a092e1f95.slice/crio-140b8792d275a07d534bc64bc1d9044aad960c4d09c020031af136b343e95ef1 WatchSource:0}: Error finding container 140b8792d275a07d534bc64bc1d9044aad960c4d09c020031af136b343e95ef1: Status 404 returned error can't find the container with id 140b8792d275a07d534bc64bc1d9044aad960c4d09c020031af136b343e95ef1 Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.915365 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:19 crc kubenswrapper[4728]: E1216 15:13:19.915544 4728 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 16 15:13:19 crc kubenswrapper[4728]: E1216 15:13:19.915562 4728 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 16 15:13:19 crc kubenswrapper[4728]: E1216 15:13:19.915612 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift podName:fc3761f8-7e22-45e1-8119-a40338b80f1d nodeName:}" failed. No retries permitted until 2025-12-16 15:13:27.915595185 +0000 UTC m=+988.755774189 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift") pod "swift-storage-0" (UID: "fc3761f8-7e22-45e1-8119-a40338b80f1d") : configmap "swift-ring-files" not found Dec 16 15:13:19 crc kubenswrapper[4728]: I1216 15:13:19.919061 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-657w4"] Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.177435 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-51a7-account-create-update-l9t68"] Dec 16 15:13:20 crc kubenswrapper[4728]: W1216 15:13:20.178056 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6444b69_7cc1_4cbd_a266_00a9f064d649.slice/crio-0ca5a94119c67192cd2e21dd21551e5d0a5dec5dbb8d714f035302993d07dcb1 WatchSource:0}: Error finding container 0ca5a94119c67192cd2e21dd21551e5d0a5dec5dbb8d714f035302993d07dcb1: Status 404 returned error can't find the container with id 0ca5a94119c67192cd2e21dd21551e5d0a5dec5dbb8d714f035302993d07dcb1 Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.254749 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-657w4" event={"ID":"e0214724-b0c1-40f7-b086-6fea171a8500","Type":"ContainerStarted","Data":"94a15d80aa5b91748fc1a632d6d94a173c4f29ac38a30b09c7e1bc8e2f3f1e89"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.254814 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-657w4" event={"ID":"e0214724-b0c1-40f7-b086-6fea171a8500","Type":"ContainerStarted","Data":"5b732bb1988bba75b553d10711c126a4ee4e37c59b90056654246f9b262e5ffb"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.257268 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dc7b-account-create-update-hk9vg" event={"ID":"e422ae6c-3605-4278-93aa-116a092e1f95","Type":"ContainerStarted","Data":"a66f0e0506808fcea0552504cc4edd82c8d81cb27c928d490cd7f70e6adf5538"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.257330 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dc7b-account-create-update-hk9vg" event={"ID":"e422ae6c-3605-4278-93aa-116a092e1f95","Type":"ContainerStarted","Data":"140b8792d275a07d534bc64bc1d9044aad960c4d09c020031af136b343e95ef1"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.265105 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-51a7-account-create-update-l9t68" event={"ID":"f6444b69-7cc1-4cbd-a266-00a9f064d649","Type":"ContainerStarted","Data":"0ca5a94119c67192cd2e21dd21551e5d0a5dec5dbb8d714f035302993d07dcb1"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.267176 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bn62h" event={"ID":"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4","Type":"ContainerStarted","Data":"57cbacc438f5b3f2c03d8d28bd12b246ef6596f07422e1d8d0db89b9d552c8cc"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.267240 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bn62h" event={"ID":"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4","Type":"ContainerStarted","Data":"db5626bb6098065a0cf63cd1bba3a58954d2ded2c24149ec3867cc36a31f003b"} Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.283672 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-657w4" 
podStartSLOduration=1.28364462 podStartE2EDuration="1.28364462s" podCreationTimestamp="2025-12-16 15:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:13:20.275190586 +0000 UTC m=+981.115369580" watchObservedRunningTime="2025-12-16 15:13:20.28364462 +0000 UTC m=+981.123823634" Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.307156 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dc7b-account-create-update-hk9vg" podStartSLOduration=2.307128324 podStartE2EDuration="2.307128324s" podCreationTimestamp="2025-12-16 15:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:13:20.29301317 +0000 UTC m=+981.133192174" watchObservedRunningTime="2025-12-16 15:13:20.307128324 +0000 UTC m=+981.147307328" Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.315533 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bn62h" podStartSLOduration=2.3155135270000002 podStartE2EDuration="2.315513527s" podCreationTimestamp="2025-12-16 15:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:13:20.308290775 +0000 UTC m=+981.148469789" watchObservedRunningTime="2025-12-16 15:13:20.315513527 +0000 UTC m=+981.155692521" Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.946436 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qvsjj"] Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.950109 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:20 crc kubenswrapper[4728]: I1216 15:13:20.960322 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qvsjj"] Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.039982 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb4k5\" (UniqueName: \"kubernetes.io/projected/923a5238-0877-49bc-8b92-37cab936f43f-kube-api-access-bb4k5\") pod \"glance-db-create-qvsjj\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.040050 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923a5238-0877-49bc-8b92-37cab936f43f-operator-scripts\") pod \"glance-db-create-qvsjj\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.041121 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2c3e-account-create-update-smqdm"] Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.042118 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.044452 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.053786 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2c3e-account-create-update-smqdm"] Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.142350 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk8ct\" (UniqueName: \"kubernetes.io/projected/23b0ce26-d151-49ef-af8e-8ca42ffe3944-kube-api-access-qk8ct\") pod \"glance-2c3e-account-create-update-smqdm\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.142557 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b0ce26-d151-49ef-af8e-8ca42ffe3944-operator-scripts\") pod \"glance-2c3e-account-create-update-smqdm\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.142644 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb4k5\" (UniqueName: \"kubernetes.io/projected/923a5238-0877-49bc-8b92-37cab936f43f-kube-api-access-bb4k5\") pod \"glance-db-create-qvsjj\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.142695 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923a5238-0877-49bc-8b92-37cab936f43f-operator-scripts\") pod \"glance-db-create-qvsjj\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.143638 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923a5238-0877-49bc-8b92-37cab936f43f-operator-scripts\") pod \"glance-db-create-qvsjj\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.165850 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb4k5\" (UniqueName: \"kubernetes.io/projected/923a5238-0877-49bc-8b92-37cab936f43f-kube-api-access-bb4k5\") pod \"glance-db-create-qvsjj\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.244824 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk8ct\" (UniqueName: \"kubernetes.io/projected/23b0ce26-d151-49ef-af8e-8ca42ffe3944-kube-api-access-qk8ct\") pod \"glance-2c3e-account-create-update-smqdm\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.244942 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b0ce26-d151-49ef-af8e-8ca42ffe3944-operator-scripts\") pod 
\"glance-2c3e-account-create-update-smqdm\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.246056 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b0ce26-d151-49ef-af8e-8ca42ffe3944-operator-scripts\") pod \"glance-2c3e-account-create-update-smqdm\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.268835 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk8ct\" (UniqueName: \"kubernetes.io/projected/23b0ce26-d151-49ef-af8e-8ca42ffe3944-kube-api-access-qk8ct\") pod \"glance-2c3e-account-create-update-smqdm\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.268844 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.268894 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.283653 4728 generic.go:334] "Generic (PLEG): container finished" podID="f6444b69-7cc1-4cbd-a266-00a9f064d649" containerID="80222ff207508a2f4cc058819bc0f6e3e14fa71a35b3a50c53731cbd8eef99b3" exitCode=0 Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.283754 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-51a7-account-create-update-l9t68" event={"ID":"f6444b69-7cc1-4cbd-a266-00a9f064d649","Type":"ContainerDied","Data":"80222ff207508a2f4cc058819bc0f6e3e14fa71a35b3a50c53731cbd8eef99b3"} Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.289165 4728 generic.go:334] "Generic (PLEG): container finished" podID="a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" containerID="57cbacc438f5b3f2c03d8d28bd12b246ef6596f07422e1d8d0db89b9d552c8cc" exitCode=0 Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.289269 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bn62h" event={"ID":"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4","Type":"ContainerDied","Data":"57cbacc438f5b3f2c03d8d28bd12b246ef6596f07422e1d8d0db89b9d552c8cc"} Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.302921 4728 generic.go:334] "Generic (PLEG): container finished" podID="e0214724-b0c1-40f7-b086-6fea171a8500" containerID="94a15d80aa5b91748fc1a632d6d94a173c4f29ac38a30b09c7e1bc8e2f3f1e89" exitCode=0 Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.302998 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-657w4" event={"ID":"e0214724-b0c1-40f7-b086-6fea171a8500","Type":"ContainerDied","Data":"94a15d80aa5b91748fc1a632d6d94a173c4f29ac38a30b09c7e1bc8e2f3f1e89"} Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.305808 4728 generic.go:334] "Generic (PLEG): container finished" podID="e422ae6c-3605-4278-93aa-116a092e1f95" containerID="a66f0e0506808fcea0552504cc4edd82c8d81cb27c928d490cd7f70e6adf5538" exitCode=0 Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.305877 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dc7b-account-create-update-hk9vg" 
event={"ID":"e422ae6c-3605-4278-93aa-116a092e1f95","Type":"ContainerDied","Data":"a66f0e0506808fcea0552504cc4edd82c8d81cb27c928d490cd7f70e6adf5538"} Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.358386 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.398945 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w49k5"] Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.399219 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerName="dnsmasq-dns" containerID="cri-o://7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3" gracePeriod=10 Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.908310 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:13:21 crc kubenswrapper[4728]: W1216 15:13:21.979012 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod923a5238_0877_49bc_8b92_37cab936f43f.slice/crio-fd433b5f68491003c39ab30633eaa4194c5f308cf3a6c908a6035bd40f8d1194 WatchSource:0}: Error finding container fd433b5f68491003c39ab30633eaa4194c5f308cf3a6c908a6035bd40f8d1194: Status 404 returned error can't find the container with id fd433b5f68491003c39ab30633eaa4194c5f308cf3a6c908a6035bd40f8d1194 Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.980530 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qvsjj"] Dec 16 15:13:21 crc kubenswrapper[4728]: I1216 15:13:21.989359 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2c3e-account-create-update-smqdm"] Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.067984 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-nb\") pod \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.068179 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-dns-svc\") pod \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.068283 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w7w2\" (UniqueName: \"kubernetes.io/projected/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-kube-api-access-2w7w2\") pod \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.068400 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-config\") pod \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.068547 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb\") pod \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.077126 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-kube-api-access-2w7w2" (OuterVolumeSpecName: "kube-api-access-2w7w2") pod "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" (UID: "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b"). InnerVolumeSpecName "kube-api-access-2w7w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.125972 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" (UID: "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.128055 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-config" (OuterVolumeSpecName: "config") pod "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" (UID: "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: E1216 15:13:22.145188 4728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb podName:0458151d-bfd1-4ed2-a2e5-ca8ffadef63b nodeName:}" failed. No retries permitted until 2025-12-16 15:13:22.645152292 +0000 UTC m=+983.485331286 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb") pod "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" (UID: "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b") : error deleting /var/lib/kubelet/pods/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b/volume-subpaths: remove /var/lib/kubelet/pods/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b/volume-subpaths: no such file or directory Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.146069 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" (UID: "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.171264 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.171329 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.171341 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w7w2\" (UniqueName: \"kubernetes.io/projected/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-kube-api-access-2w7w2\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.171355 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.315723 4728 generic.go:334] "Generic (PLEG): container finished" podID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerID="7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3" exitCode=0 Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.315773 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" event={"ID":"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b","Type":"ContainerDied","Data":"7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3"} Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.315833 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" event={"ID":"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b","Type":"ContainerDied","Data":"f8fca9500e055c140505ad07b29285f5eccadcca18fbbb8dea0bb5511b14753e"} Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.315840 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-w49k5" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.315858 4728 scope.go:117] "RemoveContainer" containerID="7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.317912 4728 generic.go:334] "Generic (PLEG): container finished" podID="23b0ce26-d151-49ef-af8e-8ca42ffe3944" containerID="95664df227752c69c91d23a20ba4bbb07f3ee4ae02130a06abafc0bfeffa4702" exitCode=0 Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.318000 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3e-account-create-update-smqdm" event={"ID":"23b0ce26-d151-49ef-af8e-8ca42ffe3944","Type":"ContainerDied","Data":"95664df227752c69c91d23a20ba4bbb07f3ee4ae02130a06abafc0bfeffa4702"} Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.318042 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3e-account-create-update-smqdm" event={"ID":"23b0ce26-d151-49ef-af8e-8ca42ffe3944","Type":"ContainerStarted","Data":"14400f7b41403f6d7591a8b02e3ba2b67d05577209f673e99ded1b7b88f583d0"} Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.325938 4728 generic.go:334] "Generic (PLEG): container finished" podID="923a5238-0877-49bc-8b92-37cab936f43f" containerID="68382ced612779b6fe24b054e8b1326f83ad78df158f5128fb0b837e36979e90" exitCode=0 Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.326038 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qvsjj" event={"ID":"923a5238-0877-49bc-8b92-37cab936f43f","Type":"ContainerDied","Data":"68382ced612779b6fe24b054e8b1326f83ad78df158f5128fb0b837e36979e90"} Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.326093 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qvsjj" event={"ID":"923a5238-0877-49bc-8b92-37cab936f43f","Type":"ContainerStarted","Data":"fd433b5f68491003c39ab30633eaa4194c5f308cf3a6c908a6035bd40f8d1194"} Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.477623 4728 scope.go:117] "RemoveContainer" containerID="e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.501369 4728 scope.go:117] "RemoveContainer" containerID="7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3" Dec 16 15:13:22 crc kubenswrapper[4728]: E1216 15:13:22.501893 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3\": container with ID starting with 7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3 not found: ID does not exist" containerID="7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.501955 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3"} err="failed to get container status \"7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3\": rpc error: code = NotFound desc = could not find container \"7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3\": container with ID starting with 7cc13a13669e4c20843ddbaee36395708ae1efc029acdebd1ec4c505a6277ff3 not found: ID does not exist" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.501992 4728 scope.go:117] "RemoveContainer" 
containerID="e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a" Dec 16 15:13:22 crc kubenswrapper[4728]: E1216 15:13:22.502332 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a\": container with ID starting with e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a not found: ID does not exist" containerID="e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.502358 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a"} err="failed to get container status \"e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a\": rpc error: code = NotFound desc = could not find container \"e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a\": container with ID starting with e8381635a9b7c89dcbd4c4fec489c9df0bd667c53acb3e459bfe4f77f265d20a not found: ID does not exist" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.681515 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb\") pod \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\" (UID: \"0458151d-bfd1-4ed2-a2e5-ca8ffadef63b\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.683200 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" (UID: "0458151d-bfd1-4ed2-a2e5-ca8ffadef63b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.711665 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.724054 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.783389 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.884755 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p866v\" (UniqueName: \"kubernetes.io/projected/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-kube-api-access-p866v\") pod \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.884943 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-operator-scripts\") pod \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\" (UID: \"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4\") " Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.886797 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" (UID: "a013edbf-b6b3-46ab-b13e-a27d3ddab2c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.892122 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-kube-api-access-p866v" (OuterVolumeSpecName: "kube-api-access-p866v") pod "a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" (UID: "a013edbf-b6b3-46ab-b13e-a27d3ddab2c4"). InnerVolumeSpecName "kube-api-access-p866v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.908763 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.916744 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-657w4" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.922824 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.971229 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w49k5"] Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.977438 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-w49k5"] Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.986868 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:22 crc kubenswrapper[4728]: I1216 15:13:22.986897 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p866v\" (UniqueName: \"kubernetes.io/projected/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4-kube-api-access-p866v\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088026 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e422ae6c-3605-4278-93aa-116a092e1f95-operator-scripts\") pod \"e422ae6c-3605-4278-93aa-116a092e1f95\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088085 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6444b69-7cc1-4cbd-a266-00a9f064d649-operator-scripts\") pod \"f6444b69-7cc1-4cbd-a266-00a9f064d649\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088121 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfhff\" (UniqueName: \"kubernetes.io/projected/e0214724-b0c1-40f7-b086-6fea171a8500-kube-api-access-zfhff\") pod \"e0214724-b0c1-40f7-b086-6fea171a8500\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088150 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghxtr\" (UniqueName: \"kubernetes.io/projected/f6444b69-7cc1-4cbd-a266-00a9f064d649-kube-api-access-ghxtr\") pod \"f6444b69-7cc1-4cbd-a266-00a9f064d649\" (UID: \"f6444b69-7cc1-4cbd-a266-00a9f064d649\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088266 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfxvq\" (UniqueName: \"kubernetes.io/projected/e422ae6c-3605-4278-93aa-116a092e1f95-kube-api-access-hfxvq\") pod \"e422ae6c-3605-4278-93aa-116a092e1f95\" (UID: \"e422ae6c-3605-4278-93aa-116a092e1f95\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088288 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0214724-b0c1-40f7-b086-6fea171a8500-operator-scripts\") pod \"e0214724-b0c1-40f7-b086-6fea171a8500\" (UID: \"e0214724-b0c1-40f7-b086-6fea171a8500\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088507 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6444b69-7cc1-4cbd-a266-00a9f064d649-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6444b69-7cc1-4cbd-a266-00a9f064d649" (UID: "f6444b69-7cc1-4cbd-a266-00a9f064d649"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088740 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0214724-b0c1-40f7-b086-6fea171a8500-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0214724-b0c1-40f7-b086-6fea171a8500" (UID: "e0214724-b0c1-40f7-b086-6fea171a8500"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.088925 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e422ae6c-3605-4278-93aa-116a092e1f95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e422ae6c-3605-4278-93aa-116a092e1f95" (UID: "e422ae6c-3605-4278-93aa-116a092e1f95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.089021 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6444b69-7cc1-4cbd-a266-00a9f064d649-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.089040 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0214724-b0c1-40f7-b086-6fea171a8500-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.091441 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0214724-b0c1-40f7-b086-6fea171a8500-kube-api-access-zfhff" (OuterVolumeSpecName: "kube-api-access-zfhff") pod "e0214724-b0c1-40f7-b086-6fea171a8500" (UID: "e0214724-b0c1-40f7-b086-6fea171a8500"). InnerVolumeSpecName "kube-api-access-zfhff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.091652 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6444b69-7cc1-4cbd-a266-00a9f064d649-kube-api-access-ghxtr" (OuterVolumeSpecName: "kube-api-access-ghxtr") pod "f6444b69-7cc1-4cbd-a266-00a9f064d649" (UID: "f6444b69-7cc1-4cbd-a266-00a9f064d649"). InnerVolumeSpecName "kube-api-access-ghxtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.092817 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e422ae6c-3605-4278-93aa-116a092e1f95-kube-api-access-hfxvq" (OuterVolumeSpecName: "kube-api-access-hfxvq") pod "e422ae6c-3605-4278-93aa-116a092e1f95" (UID: "e422ae6c-3605-4278-93aa-116a092e1f95"). InnerVolumeSpecName "kube-api-access-hfxvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.191242 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e422ae6c-3605-4278-93aa-116a092e1f95-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.191317 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfhff\" (UniqueName: \"kubernetes.io/projected/e0214724-b0c1-40f7-b086-6fea171a8500-kube-api-access-zfhff\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.191347 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghxtr\" (UniqueName: \"kubernetes.io/projected/f6444b69-7cc1-4cbd-a266-00a9f064d649-kube-api-access-ghxtr\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.191374 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfxvq\" (UniqueName: \"kubernetes.io/projected/e422ae6c-3605-4278-93aa-116a092e1f95-kube-api-access-hfxvq\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.338752 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-51a7-account-create-update-l9t68" event={"ID":"f6444b69-7cc1-4cbd-a266-00a9f064d649","Type":"ContainerDied","Data":"0ca5a94119c67192cd2e21dd21551e5d0a5dec5dbb8d714f035302993d07dcb1"} Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.338792 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca5a94119c67192cd2e21dd21551e5d0a5dec5dbb8d714f035302993d07dcb1" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.338863 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-51a7-account-create-update-l9t68" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.341022 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bn62h" event={"ID":"a013edbf-b6b3-46ab-b13e-a27d3ddab2c4","Type":"ContainerDied","Data":"db5626bb6098065a0cf63cd1bba3a58954d2ded2c24149ec3867cc36a31f003b"} Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.341059 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db5626bb6098065a0cf63cd1bba3a58954d2ded2c24149ec3867cc36a31f003b" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.341064 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bn62h" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.350298 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-657w4" event={"ID":"e0214724-b0c1-40f7-b086-6fea171a8500","Type":"ContainerDied","Data":"5b732bb1988bba75b553d10711c126a4ee4e37c59b90056654246f9b262e5ffb"} Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.350344 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b732bb1988bba75b553d10711c126a4ee4e37c59b90056654246f9b262e5ffb" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.350439 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-657w4" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.355146 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dc7b-account-create-update-hk9vg" event={"ID":"e422ae6c-3605-4278-93aa-116a092e1f95","Type":"ContainerDied","Data":"140b8792d275a07d534bc64bc1d9044aad960c4d09c020031af136b343e95ef1"} Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.355208 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="140b8792d275a07d534bc64bc1d9044aad960c4d09c020031af136b343e95ef1" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.355164 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dc7b-account-create-update-hk9vg" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.356800 4728 generic.go:334] "Generic (PLEG): container finished" podID="55ebd6bb-cac2-4b8f-932d-46662c011b18" containerID="e25003fa40aa710f08944965f1c59c8ec0bf77fa75cd80f0b6b429f67d158f91" exitCode=0 Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.356858 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvql8" event={"ID":"55ebd6bb-cac2-4b8f-932d-46662c011b18","Type":"ContainerDied","Data":"e25003fa40aa710f08944965f1c59c8ec0bf77fa75cd80f0b6b429f67d158f91"} Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.515797 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" path="/var/lib/kubelet/pods/0458151d-bfd1-4ed2-a2e5-ca8ffadef63b/volumes" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.759054 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.764122 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.903772 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk8ct\" (UniqueName: \"kubernetes.io/projected/23b0ce26-d151-49ef-af8e-8ca42ffe3944-kube-api-access-qk8ct\") pod \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.904052 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b0ce26-d151-49ef-af8e-8ca42ffe3944-operator-scripts\") pod \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\" (UID: \"23b0ce26-d151-49ef-af8e-8ca42ffe3944\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.904140 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb4k5\" (UniqueName: \"kubernetes.io/projected/923a5238-0877-49bc-8b92-37cab936f43f-kube-api-access-bb4k5\") pod \"923a5238-0877-49bc-8b92-37cab936f43f\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.904317 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923a5238-0877-49bc-8b92-37cab936f43f-operator-scripts\") pod \"923a5238-0877-49bc-8b92-37cab936f43f\" (UID: \"923a5238-0877-49bc-8b92-37cab936f43f\") " Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.905070 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923a5238-0877-49bc-8b92-37cab936f43f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "923a5238-0877-49bc-8b92-37cab936f43f" (UID: "923a5238-0877-49bc-8b92-37cab936f43f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.905143 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b0ce26-d151-49ef-af8e-8ca42ffe3944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23b0ce26-d151-49ef-af8e-8ca42ffe3944" (UID: "23b0ce26-d151-49ef-af8e-8ca42ffe3944"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.908259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b0ce26-d151-49ef-af8e-8ca42ffe3944-kube-api-access-qk8ct" (OuterVolumeSpecName: "kube-api-access-qk8ct") pod "23b0ce26-d151-49ef-af8e-8ca42ffe3944" (UID: "23b0ce26-d151-49ef-af8e-8ca42ffe3944"). InnerVolumeSpecName "kube-api-access-qk8ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:23 crc kubenswrapper[4728]: I1216 15:13:23.912605 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923a5238-0877-49bc-8b92-37cab936f43f-kube-api-access-bb4k5" (OuterVolumeSpecName: "kube-api-access-bb4k5") pod "923a5238-0877-49bc-8b92-37cab936f43f" (UID: "923a5238-0877-49bc-8b92-37cab936f43f"). InnerVolumeSpecName "kube-api-access-bb4k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.006711 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk8ct\" (UniqueName: \"kubernetes.io/projected/23b0ce26-d151-49ef-af8e-8ca42ffe3944-kube-api-access-qk8ct\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.006751 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b0ce26-d151-49ef-af8e-8ca42ffe3944-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.006765 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb4k5\" (UniqueName: \"kubernetes.io/projected/923a5238-0877-49bc-8b92-37cab936f43f-kube-api-access-bb4k5\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.006777 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923a5238-0877-49bc-8b92-37cab936f43f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.367265 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qvsjj" event={"ID":"923a5238-0877-49bc-8b92-37cab936f43f","Type":"ContainerDied","Data":"fd433b5f68491003c39ab30633eaa4194c5f308cf3a6c908a6035bd40f8d1194"} Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.367301 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qvsjj" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.367316 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd433b5f68491003c39ab30633eaa4194c5f308cf3a6c908a6035bd40f8d1194" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.369143 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2c3e-account-create-update-smqdm" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.369297 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2c3e-account-create-update-smqdm" event={"ID":"23b0ce26-d151-49ef-af8e-8ca42ffe3944","Type":"ContainerDied","Data":"14400f7b41403f6d7591a8b02e3ba2b67d05577209f673e99ded1b7b88f583d0"} Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.369379 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14400f7b41403f6d7591a8b02e3ba2b67d05577209f673e99ded1b7b88f583d0" Dec 16 15:13:24 crc kubenswrapper[4728]: E1216 15:13:24.509323 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23b0ce26_d151_49ef_af8e_8ca42ffe3944.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23b0ce26_d151_49ef_af8e_8ca42ffe3944.slice/crio-14400f7b41403f6d7591a8b02e3ba2b67d05577209f673e99ded1b7b88f583d0\": RecentStats: unable to find data in memory cache]" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.816174 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923107 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-scripts\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923163 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-combined-ca-bundle\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923193 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-dispersionconf\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923212 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-ring-data-devices\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923243 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-swiftconf\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923282 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55ebd6bb-cac2-4b8f-932d-46662c011b18-etc-swift\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.923356 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4847\" (UniqueName: \"kubernetes.io/projected/55ebd6bb-cac2-4b8f-932d-46662c011b18-kube-api-access-g4847\") pod \"55ebd6bb-cac2-4b8f-932d-46662c011b18\" (UID: \"55ebd6bb-cac2-4b8f-932d-46662c011b18\") " Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.924652 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.925002 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ebd6bb-cac2-4b8f-932d-46662c011b18-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.928058 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ebd6bb-cac2-4b8f-932d-46662c011b18-kube-api-access-g4847" (OuterVolumeSpecName: "kube-api-access-g4847") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "kube-api-access-g4847". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.933913 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.942562 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.945118 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-scripts" (OuterVolumeSpecName: "scripts") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:24 crc kubenswrapper[4728]: I1216 15:13:24.950662 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55ebd6bb-cac2-4b8f-932d-46662c011b18" (UID: "55ebd6bb-cac2-4b8f-932d-46662c011b18"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025171 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025205 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025217 4728 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025226 4728 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55ebd6bb-cac2-4b8f-932d-46662c011b18-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025234 4728 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55ebd6bb-cac2-4b8f-932d-46662c011b18-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025242 4728 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55ebd6bb-cac2-4b8f-932d-46662c011b18-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.025253 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4847\" (UniqueName: \"kubernetes.io/projected/55ebd6bb-cac2-4b8f-932d-46662c011b18-kube-api-access-g4847\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.380244 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vvql8" event={"ID":"55ebd6bb-cac2-4b8f-932d-46662c011b18","Type":"ContainerDied","Data":"584e1624d0f9fcd2de0f6843381f76ff8f59c517350341d81ff229f9e5c3d8c1"} Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.380309 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="584e1624d0f9fcd2de0f6843381f76ff8f59c517350341d81ff229f9e5c3d8c1" Dec 16 15:13:25 crc kubenswrapper[4728]: I1216 15:13:25.380361 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vvql8" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.193386 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tnpp2"] Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.194461 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.194589 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.194711 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerName="dnsmasq-dns" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.194807 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerName="dnsmasq-dns" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.194894 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923a5238-0877-49bc-8b92-37cab936f43f" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.194968 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="923a5238-0877-49bc-8b92-37cab936f43f" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.195056 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerName="init" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.195130 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerName="init" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.195225 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e422ae6c-3605-4278-93aa-116a092e1f95" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.195340 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e422ae6c-3605-4278-93aa-116a092e1f95" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.195602 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6444b69-7cc1-4cbd-a266-00a9f064d649" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.195686 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6444b69-7cc1-4cbd-a266-00a9f064d649" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.195767 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0214724-b0c1-40f7-b086-6fea171a8500" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.195850 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0214724-b0c1-40f7-b086-6fea171a8500" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 15:13:26.195927 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b0ce26-d151-49ef-af8e-8ca42ffe3944" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196008 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b0ce26-d151-49ef-af8e-8ca42ffe3944" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: E1216 
15:13:26.196089 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ebd6bb-cac2-4b8f-932d-46662c011b18" containerName="swift-ring-rebalance" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196162 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ebd6bb-cac2-4b8f-932d-46662c011b18" containerName="swift-ring-rebalance" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196479 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="923a5238-0877-49bc-8b92-37cab936f43f" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196604 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6444b69-7cc1-4cbd-a266-00a9f064d649" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196706 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e422ae6c-3605-4278-93aa-116a092e1f95" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196788 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ebd6bb-cac2-4b8f-932d-46662c011b18" containerName="swift-ring-rebalance" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196879 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0214724-b0c1-40f7-b086-6fea171a8500" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.196955 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b0ce26-d151-49ef-af8e-8ca42ffe3944" containerName="mariadb-account-create-update" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.197030 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" containerName="mariadb-database-create" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.197109 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0458151d-bfd1-4ed2-a2e5-ca8ffadef63b" containerName="dnsmasq-dns" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.197849 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.201553 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-npj6q" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.201884 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.216657 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tnpp2"] Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.234810 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hlkkv" podUID="37c82b8b-fe2d-4265-80b1-7cdfa00e2be7" containerName="ovn-controller" probeResult="failure" output=< Dec 16 15:13:26 crc kubenswrapper[4728]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 15:13:26 crc kubenswrapper[4728]: > Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.251361 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.352234 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-combined-ca-bundle\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.352311 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgx5h\" (UniqueName: \"kubernetes.io/projected/316025cd-8999-4601-a3df-4aaf1dad3a83-kube-api-access-jgx5h\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.352345 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-db-sync-config-data\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.352374 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-config-data\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.454010 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-db-sync-config-data\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.455169 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-config-data\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 
15:13:26.455568 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-combined-ca-bundle\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.455818 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgx5h\" (UniqueName: \"kubernetes.io/projected/316025cd-8999-4601-a3df-4aaf1dad3a83-kube-api-access-jgx5h\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.462869 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-db-sync-config-data\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.465550 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-config-data\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.468272 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-combined-ca-bundle\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.478303 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgx5h\" (UniqueName: \"kubernetes.io/projected/316025cd-8999-4601-a3df-4aaf1dad3a83-kube-api-access-jgx5h\") pod \"glance-db-sync-tnpp2\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:26 crc kubenswrapper[4728]: I1216 15:13:26.516891 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tnpp2" Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.103056 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tnpp2"] Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.395452 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tnpp2" event={"ID":"316025cd-8999-4601-a3df-4aaf1dad3a83","Type":"ContainerStarted","Data":"3e6410007c065e2e20756141fec86ba7ee543c8f0c68dc7fd654ff5a00be8214"} Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.397018 4728 generic.go:334] "Generic (PLEG): container finished" podID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerID="25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5" exitCode=0 Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.397077 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31e565e7-a84a-436e-bc5d-dc107a42ef0f","Type":"ContainerDied","Data":"25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5"} Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.399630 4728 generic.go:334] "Generic (PLEG): container finished" podID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerID="8773469a391248aa723b82a38b327739121d862fabed4fa45660e65d6ebf6b43" exitCode=0 Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.399681 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b12213-b2ec-4fa5-b848-d06fe7855247","Type":"ContainerDied","Data":"8773469a391248aa723b82a38b327739121d862fabed4fa45660e65d6ebf6b43"} Dec 16 15:13:27 crc kubenswrapper[4728]: I1216 15:13:27.980954 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.004060 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fc3761f8-7e22-45e1-8119-a40338b80f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"fc3761f8-7e22-45e1-8119-a40338b80f1d\") " pod="openstack/swift-storage-0" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.014585 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.416812 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b12213-b2ec-4fa5-b848-d06fe7855247","Type":"ContainerStarted","Data":"69d182808d85ef8d13fed9a1b0ac19d7d1bef637754fda7f1fa2fcc8415b1b1e"} Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.417172 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.419202 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31e565e7-a84a-436e-bc5d-dc107a42ef0f","Type":"ContainerStarted","Data":"7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6"} Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.419519 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.440341 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.971783767 podStartE2EDuration="1m4.440305663s" podCreationTimestamp="2025-12-16 15:12:24 +0000 UTC" firstStartedPulling="2025-12-16 15:12:43.126824528 +0000 UTC m=+943.967003512" lastFinishedPulling="2025-12-16 15:12:50.595346424 +0000 UTC m=+951.435525408" observedRunningTime="2025-12-16 15:13:28.438966707 +0000 UTC m=+989.279145701" watchObservedRunningTime="2025-12-16 15:13:28.440305663 +0000 UTC m=+989.280484647" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.480308 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.799307056 podStartE2EDuration="1m4.480287284s" podCreationTimestamp="2025-12-16 15:12:24 +0000 UTC" firstStartedPulling="2025-12-16 15:12:43.76493977 +0000 UTC m=+944.605118754" lastFinishedPulling="2025-12-16 15:12:52.445919978 +0000 UTC m=+953.286098982" observedRunningTime="2025-12-16 15:13:28.47070432 +0000 UTC m=+989.310883324" watchObservedRunningTime="2025-12-16 15:13:28.480287284 +0000 UTC m=+989.320466268" Dec 16 15:13:28 crc kubenswrapper[4728]: I1216 15:13:28.624825 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 16 15:13:29 crc kubenswrapper[4728]: I1216 15:13:29.429274 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"22f142dce1451ee3c33585b7fb7423d0f5c8ca90c718b45c2bb9be83d165c865"} Dec 16 15:13:30 crc kubenswrapper[4728]: I1216 15:13:30.437292 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"83aed1f52216349f9fe279f1cd56017c3c6c2df3ad8a54d1eeb7015450bfc93e"} Dec 16 15:13:30 crc kubenswrapper[4728]: I1216 15:13:30.437568 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"a0e8f28dcad6db4b626f3c902b2f3563f5583958cac2eb15f4ec8c7bf57f2daa"} Dec 16 15:13:30 crc kubenswrapper[4728]: I1216 15:13:30.437580 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"52d6e5486d901d647c57f18fad0428de999102a38f35f3b2a650cac98eacea0b"} Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.232340 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hlkkv" podUID="37c82b8b-fe2d-4265-80b1-7cdfa00e2be7" containerName="ovn-controller" probeResult="failure" output=< Dec 16 15:13:31 crc kubenswrapper[4728]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 16 15:13:31 crc kubenswrapper[4728]: > Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.286048 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j4m68" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.448742 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"5d381ee5ae442e58bb114fe4de8d63e22c66aeb5d36f2a34aaa3a7eb1658c01d"} Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.501719 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hlkkv-config-m8z2n"] Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.502826 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.508026 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.539107 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlkkv-config-m8z2n"] Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.656577 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-additional-scripts\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.656861 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-scripts\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.657470 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.657505 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-log-ovn\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.657529 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run-ovn\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.657547 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzk5c\" (UniqueName: \"kubernetes.io/projected/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-kube-api-access-fzk5c\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.759142 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-additional-scripts\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.759237 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-scripts\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.759303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.759340 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-log-ovn\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.759379 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run-ovn\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.759398 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzk5c\" (UniqueName: \"kubernetes.io/projected/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-kube-api-access-fzk5c\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.760799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 
15:13:31.760918 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-additional-scripts\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.761020 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-log-ovn\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.761206 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run-ovn\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.763327 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-scripts\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.781143 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzk5c\" (UniqueName: \"kubernetes.io/projected/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-kube-api-access-fzk5c\") pod \"ovn-controller-hlkkv-config-m8z2n\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:31 crc kubenswrapper[4728]: I1216 15:13:31.833500 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:33 crc kubenswrapper[4728]: I1216 15:13:33.079183 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hlkkv-config-m8z2n"] Dec 16 15:13:33 crc kubenswrapper[4728]: I1216 15:13:33.476811 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"f5ff6d573cb0506e709c935930ac3a456f65c70b49d81df95c78918c1521eda0"} Dec 16 15:13:33 crc kubenswrapper[4728]: I1216 15:13:33.477134 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"4a2b2b0bd5b0615e66eec84be9a00477b1ab7f658a7abaeb3b98a01d345b9fe3"} Dec 16 15:13:33 crc kubenswrapper[4728]: I1216 15:13:33.482110 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlkkv-config-m8z2n" event={"ID":"be0b4d59-ea7a-453d-9bc3-7fb4122fc946","Type":"ContainerStarted","Data":"69b0e5bdf0d8ef4dfb54cd3e7b74c0ad370e04aff2f180a164b6b1887057917f"} Dec 16 15:13:34 crc kubenswrapper[4728]: I1216 15:13:34.496832 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"e3ed419f144757613627486ea06c8f9330b5185f17e00d9ab833e698f364b047"} Dec 16 15:13:34 crc kubenswrapper[4728]: I1216 15:13:34.498730 4728 generic.go:334] "Generic (PLEG): container finished" podID="be0b4d59-ea7a-453d-9bc3-7fb4122fc946" containerID="7bfa0516798c743159e966ef01cdc65ffa6700edd375c99066873e5011be5539" exitCode=0 Dec 16 15:13:34 crc kubenswrapper[4728]: I1216 15:13:34.498758 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlkkv-config-m8z2n" event={"ID":"be0b4d59-ea7a-453d-9bc3-7fb4122fc946","Type":"ContainerDied","Data":"7bfa0516798c743159e966ef01cdc65ffa6700edd375c99066873e5011be5539"} Dec 16 15:13:36 crc kubenswrapper[4728]: I1216 15:13:36.228489 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hlkkv" Dec 16 15:13:38 crc kubenswrapper[4728]: I1216 15:13:38.819094 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:13:38 crc kubenswrapper[4728]: I1216 15:13:38.819566 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:13:45 crc kubenswrapper[4728]: I1216 15:13:45.681647 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 15:13:45 crc kubenswrapper[4728]: I1216 15:13:45.971250 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.080121 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-r77w5"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.081446 4728 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.102050 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r77w5"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.133635 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d4ab17-9896-4998-a391-38740aabe347-operator-scripts\") pod \"cinder-db-create-r77w5\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.133771 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c5jb\" (UniqueName: \"kubernetes.io/projected/54d4ab17-9896-4998-a391-38740aabe347-kube-api-access-2c5jb\") pod \"cinder-db-create-r77w5\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.171763 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sd955"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.172962 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.181013 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-dce7-account-create-update-56dlj"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.181923 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.186215 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.192575 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sd955"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.215887 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dce7-account-create-update-56dlj"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.259302 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a1be12-1801-4429-90e9-120ecaa41788-operator-scripts\") pod \"cinder-dce7-account-create-update-56dlj\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.259370 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d4ab17-9896-4998-a391-38740aabe347-operator-scripts\") pod \"cinder-db-create-r77w5\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.259459 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c5jb\" (UniqueName: \"kubernetes.io/projected/54d4ab17-9896-4998-a391-38740aabe347-kube-api-access-2c5jb\") pod \"cinder-db-create-r77w5\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.259514 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-operator-scripts\") pod \"barbican-db-create-sd955\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.259543 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk8jt\" (UniqueName: \"kubernetes.io/projected/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-kube-api-access-gk8jt\") pod \"barbican-db-create-sd955\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.259607 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hspf\" (UniqueName: \"kubernetes.io/projected/02a1be12-1801-4429-90e9-120ecaa41788-kube-api-access-6hspf\") pod \"cinder-dce7-account-create-update-56dlj\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.266619 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d4ab17-9896-4998-a391-38740aabe347-operator-scripts\") pod \"cinder-db-create-r77w5\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.299912 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-85fc-account-create-update-6fm7d"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.300873 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.302855 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.312327 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c5jb\" (UniqueName: \"kubernetes.io/projected/54d4ab17-9896-4998-a391-38740aabe347-kube-api-access-2c5jb\") pod \"cinder-db-create-r77w5\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.343510 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-85fc-account-create-update-6fm7d"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.351194 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2lb8c"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.352283 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.356050 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.356379 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.356622 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.356671 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9cbd6" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.361251 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-operator-scripts\") pod \"barbican-db-create-sd955\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.361286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk8jt\" (UniqueName: \"kubernetes.io/projected/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-kube-api-access-gk8jt\") pod \"barbican-db-create-sd955\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.361334 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hspf\" (UniqueName: \"kubernetes.io/projected/02a1be12-1801-4429-90e9-120ecaa41788-kube-api-access-6hspf\") pod \"cinder-dce7-account-create-update-56dlj\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.361399 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a1be12-1801-4429-90e9-120ecaa41788-operator-scripts\") pod \"cinder-dce7-account-create-update-56dlj\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.362420 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-operator-scripts\") pod \"barbican-db-create-sd955\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.362886 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a1be12-1801-4429-90e9-120ecaa41788-operator-scripts\") pod \"cinder-dce7-account-create-update-56dlj\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.368262 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2lb8c"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.384372 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk8jt\" (UniqueName: \"kubernetes.io/projected/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-kube-api-access-gk8jt\") pod 
\"barbican-db-create-sd955\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.390118 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hspf\" (UniqueName: \"kubernetes.io/projected/02a1be12-1801-4429-90e9-120ecaa41788-kube-api-access-6hspf\") pod \"cinder-dce7-account-create-update-56dlj\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.398527 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.399424 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-t2ct9"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.401608 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.433102 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t2ct9"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.464246 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngc9h\" (UniqueName: \"kubernetes.io/projected/f19c009d-8b36-4b96-9995-541e097b4f21-kube-api-access-ngc9h\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.464371 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-config-data\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.464430 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-operator-scripts\") pod \"barbican-85fc-account-create-update-6fm7d\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.464492 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-combined-ca-bundle\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.464556 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlz6b\" (UniqueName: \"kubernetes.io/projected/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-kube-api-access-rlz6b\") pod \"barbican-85fc-account-create-update-6fm7d\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.500567 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sd955" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.515219 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.565800 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-combined-ca-bundle\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.565868 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlz6b\" (UniqueName: \"kubernetes.io/projected/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-kube-api-access-rlz6b\") pod \"barbican-85fc-account-create-update-6fm7d\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.565917 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngc9h\" (UniqueName: \"kubernetes.io/projected/f19c009d-8b36-4b96-9995-541e097b4f21-kube-api-access-ngc9h\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.565997 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-config-data\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.566025 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-operator-scripts\") pod \"barbican-85fc-account-create-update-6fm7d\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.566047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4129cf7-bc79-4961-aed2-ff704fd4c29e-operator-scripts\") pod \"neutron-db-create-t2ct9\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.566072 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzxpc\" (UniqueName: \"kubernetes.io/projected/b4129cf7-bc79-4961-aed2-ff704fd4c29e-kube-api-access-dzxpc\") pod \"neutron-db-create-t2ct9\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.566745 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-operator-scripts\") pod \"barbican-85fc-account-create-update-6fm7d\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.568810 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-combined-ca-bundle\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.573920 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-config-data\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.582625 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlz6b\" (UniqueName: \"kubernetes.io/projected/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-kube-api-access-rlz6b\") pod \"barbican-85fc-account-create-update-6fm7d\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.586841 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngc9h\" (UniqueName: \"kubernetes.io/projected/f19c009d-8b36-4b96-9995-541e097b4f21-kube-api-access-ngc9h\") pod \"keystone-db-sync-2lb8c\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.639692 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.667515 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4129cf7-bc79-4961-aed2-ff704fd4c29e-operator-scripts\") pod \"neutron-db-create-t2ct9\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.667564 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzxpc\" (UniqueName: \"kubernetes.io/projected/b4129cf7-bc79-4961-aed2-ff704fd4c29e-kube-api-access-dzxpc\") pod \"neutron-db-create-t2ct9\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.669421 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4129cf7-bc79-4961-aed2-ff704fd4c29e-operator-scripts\") pod \"neutron-db-create-t2ct9\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.677252 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.692981 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzxpc\" (UniqueName: \"kubernetes.io/projected/b4129cf7-bc79-4961-aed2-ff704fd4c29e-kube-api-access-dzxpc\") pod \"neutron-db-create-t2ct9\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.747103 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.827211 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fb5d-account-create-update-vx5xh"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.828671 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.830678 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.837997 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fb5d-account-create-update-vx5xh"] Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.972538 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-operator-scripts\") pod \"neutron-fb5d-account-create-update-vx5xh\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:46 crc kubenswrapper[4728]: I1216 15:13:46.972716 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntgt\" (UniqueName: \"kubernetes.io/projected/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-kube-api-access-fntgt\") pod \"neutron-fb5d-account-create-update-vx5xh\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.074278 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fntgt\" (UniqueName: \"kubernetes.io/projected/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-kube-api-access-fntgt\") pod \"neutron-fb5d-account-create-update-vx5xh\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.074376 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-operator-scripts\") pod \"neutron-fb5d-account-create-update-vx5xh\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.075265 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-operator-scripts\") pod \"neutron-fb5d-account-create-update-vx5xh\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.094954 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntgt\" (UniqueName: \"kubernetes.io/projected/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-kube-api-access-fntgt\") pod \"neutron-fb5d-account-create-update-vx5xh\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.142253 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:47 crc kubenswrapper[4728]: E1216 15:13:47.585626 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 16 15:13:47 crc kubenswrapper[4728]: E1216 15:13:47.585800 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgx5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-tnpp2_openstack(316025cd-8999-4601-a3df-4aaf1dad3a83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:13:47 crc kubenswrapper[4728]: E1216 15:13:47.589458 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-tnpp2" podUID="316025cd-8999-4601-a3df-4aaf1dad3a83" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.669150 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hlkkv-config-m8z2n" event={"ID":"be0b4d59-ea7a-453d-9bc3-7fb4122fc946","Type":"ContainerDied","Data":"69b0e5bdf0d8ef4dfb54cd3e7b74c0ad370e04aff2f180a164b6b1887057917f"} Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.669213 4728 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="69b0e5bdf0d8ef4dfb54cd3e7b74c0ad370e04aff2f180a164b6b1887057917f" Dec 16 15:13:47 crc kubenswrapper[4728]: E1216 15:13:47.673590 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-tnpp2" podUID="316025cd-8999-4601-a3df-4aaf1dad3a83" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.675603 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.789910 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-log-ovn\") pod \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.790276 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run\") pod \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.790317 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzk5c\" (UniqueName: \"kubernetes.io/projected/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-kube-api-access-fzk5c\") pod \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.790449 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-scripts\") pod \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.790491 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-additional-scripts\") pod \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.790564 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run-ovn\") pod \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\" (UID: \"be0b4d59-ea7a-453d-9bc3-7fb4122fc946\") " Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.791422 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "be0b4d59-ea7a-453d-9bc3-7fb4122fc946" (UID: "be0b4d59-ea7a-453d-9bc3-7fb4122fc946"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.791449 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run" (OuterVolumeSpecName: "var-run") pod "be0b4d59-ea7a-453d-9bc3-7fb4122fc946" (UID: "be0b4d59-ea7a-453d-9bc3-7fb4122fc946"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.792502 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "be0b4d59-ea7a-453d-9bc3-7fb4122fc946" (UID: "be0b4d59-ea7a-453d-9bc3-7fb4122fc946"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.792634 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "be0b4d59-ea7a-453d-9bc3-7fb4122fc946" (UID: "be0b4d59-ea7a-453d-9bc3-7fb4122fc946"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.792641 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-scripts" (OuterVolumeSpecName: "scripts") pod "be0b4d59-ea7a-453d-9bc3-7fb4122fc946" (UID: "be0b4d59-ea7a-453d-9bc3-7fb4122fc946"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.799962 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-kube-api-access-fzk5c" (OuterVolumeSpecName: "kube-api-access-fzk5c") pod "be0b4d59-ea7a-453d-9bc3-7fb4122fc946" (UID: "be0b4d59-ea7a-453d-9bc3-7fb4122fc946"). InnerVolumeSpecName "kube-api-access-fzk5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.892627 4728 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.892663 4728 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.892674 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzk5c\" (UniqueName: \"kubernetes.io/projected/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-kube-api-access-fzk5c\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.892683 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.892690 4728 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:47 crc kubenswrapper[4728]: I1216 15:13:47.892699 4728 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be0b4d59-ea7a-453d-9bc3-7fb4122fc946-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.337263 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2lb8c"] Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.414245 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-85fc-account-create-update-6fm7d"] Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.425823 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fb5d-account-create-update-vx5xh"] Dec 16 15:13:48 crc kubenswrapper[4728]: W1216 15:13:48.432786 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c5fa45f_6924_4aca_b07f_f7a26af9ae1e.slice/crio-a4e1c639abc22ec0fe215f3c4178c9bb4d1987964915ed42482ad5f1bed78cc3 WatchSource:0}: Error finding container a4e1c639abc22ec0fe215f3c4178c9bb4d1987964915ed42482ad5f1bed78cc3: Status 404 returned error can't find the container with id a4e1c639abc22ec0fe215f3c4178c9bb4d1987964915ed42482ad5f1bed78cc3 Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.439795 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t2ct9"] Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.447477 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r77w5"] Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.452574 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sd955"] Dec 16 15:13:48 crc kubenswrapper[4728]: W1216 15:13:48.471730 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda29f3ca_e4e7_4f01_9dfa_315a928c25c3.slice/crio-bef2c92898a58bdf7f2c12d9f526bc3495bf866aa7e8dad0bb9f48efd3889011 WatchSource:0}: Error finding container 
bef2c92898a58bdf7f2c12d9f526bc3495bf866aa7e8dad0bb9f48efd3889011: Status 404 returned error can't find the container with id bef2c92898a58bdf7f2c12d9f526bc3495bf866aa7e8dad0bb9f48efd3889011 Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.617476 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dce7-account-create-update-56dlj"] Dec 16 15:13:48 crc kubenswrapper[4728]: W1216 15:13:48.625826 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a1be12_1801_4429_90e9_120ecaa41788.slice/crio-f0def956cdb2f98fb11ee3bf48d3955af09e5ffc54809e9eaa916a44c67bb4f2 WatchSource:0}: Error finding container f0def956cdb2f98fb11ee3bf48d3955af09e5ffc54809e9eaa916a44c67bb4f2: Status 404 returned error can't find the container with id f0def956cdb2f98fb11ee3bf48d3955af09e5ffc54809e9eaa916a44c67bb4f2 Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.675934 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t2ct9" event={"ID":"b4129cf7-bc79-4961-aed2-ff704fd4c29e","Type":"ContainerStarted","Data":"a90cacd3d2f0e9a3bbf2fc3f40ef20d99e7ee5a866bcb447ac86090262d76dfc"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.676977 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r77w5" event={"ID":"54d4ab17-9896-4998-a391-38740aabe347","Type":"ContainerStarted","Data":"a91d5fa4e65db4064a5216ddda28a2ba79472c7758ccc87883cd6e414e6615f3"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.690892 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"951f19ad6f06746bdd57490ddbc5689991d608b89f5db5ae16c16e1a29a15123"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.691911 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb5d-account-create-update-vx5xh" event={"ID":"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e","Type":"ContainerStarted","Data":"a4e1c639abc22ec0fe215f3c4178c9bb4d1987964915ed42482ad5f1bed78cc3"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.693788 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dce7-account-create-update-56dlj" event={"ID":"02a1be12-1801-4429-90e9-120ecaa41788","Type":"ContainerStarted","Data":"f0def956cdb2f98fb11ee3bf48d3955af09e5ffc54809e9eaa916a44c67bb4f2"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.695748 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-85fc-account-create-update-6fm7d" event={"ID":"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f","Type":"ContainerStarted","Data":"f12d92d63a7486923f16254014440521fbe7f597d44de65db845d2b8a0cd0b15"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.697523 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lb8c" event={"ID":"f19c009d-8b36-4b96-9995-541e097b4f21","Type":"ContainerStarted","Data":"af09a250631b63d294e0db2752b3fd008ba329a8a112694570d03d29d44fbaef"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.699232 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sd955" event={"ID":"da29f3ca-e4e7-4f01-9dfa-315a928c25c3","Type":"ContainerStarted","Data":"bef2c92898a58bdf7f2c12d9f526bc3495bf866aa7e8dad0bb9f48efd3889011"} Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.699282 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hlkkv-config-m8z2n" Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.823577 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hlkkv-config-m8z2n"] Dec 16 15:13:48 crc kubenswrapper[4728]: I1216 15:13:48.825755 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hlkkv-config-m8z2n"] Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.525725 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0b4d59-ea7a-453d-9bc3-7fb4122fc946" path="/var/lib/kubelet/pods/be0b4d59-ea7a-453d-9bc3-7fb4122fc946/volumes" Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.709169 4728 generic.go:334] "Generic (PLEG): container finished" podID="b4129cf7-bc79-4961-aed2-ff704fd4c29e" containerID="77c340523d1a2a994992f73c317da8c4d736bf12a1ed83d0bf7c952fcbf52056" exitCode=0 Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.709277 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t2ct9" event={"ID":"b4129cf7-bc79-4961-aed2-ff704fd4c29e","Type":"ContainerDied","Data":"77c340523d1a2a994992f73c317da8c4d736bf12a1ed83d0bf7c952fcbf52056"} Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.710666 4728 generic.go:334] "Generic (PLEG): container finished" podID="54d4ab17-9896-4998-a391-38740aabe347" containerID="7c48a120c584a8485027a015d3c0a95f2354bd4cc842712c235d869ea07a58f5" exitCode=0 Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.710735 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r77w5" event={"ID":"54d4ab17-9896-4998-a391-38740aabe347","Type":"ContainerDied","Data":"7c48a120c584a8485027a015d3c0a95f2354bd4cc842712c235d869ea07a58f5"} Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.712496 4728 generic.go:334] "Generic (PLEG): container finished" podID="7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" containerID="954de75c25c3e8b73e274320be59e5cba4d0b6b2868b9871771c32edc5076599" exitCode=0 Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.712576 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb5d-account-create-update-vx5xh" event={"ID":"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e","Type":"ContainerDied","Data":"954de75c25c3e8b73e274320be59e5cba4d0b6b2868b9871771c32edc5076599"} Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.714277 4728 generic.go:334] "Generic (PLEG): container finished" podID="02a1be12-1801-4429-90e9-120ecaa41788" containerID="7d312bb8d2de0d5f58e3774229fc478c86b781322562756eb684719c73a34bd7" exitCode=0 Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.714332 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dce7-account-create-update-56dlj" event={"ID":"02a1be12-1801-4429-90e9-120ecaa41788","Type":"ContainerDied","Data":"7d312bb8d2de0d5f58e3774229fc478c86b781322562756eb684719c73a34bd7"} Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.716923 4728 generic.go:334] "Generic (PLEG): container finished" podID="78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" containerID="21f7766c39bdd705e0930000b54ad8e4bd553d9ddfe515e2edacec61d691730f" exitCode=0 Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.716967 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-85fc-account-create-update-6fm7d" event={"ID":"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f","Type":"ContainerDied","Data":"21f7766c39bdd705e0930000b54ad8e4bd553d9ddfe515e2edacec61d691730f"} Dec 16 15:13:49 crc 
kubenswrapper[4728]: I1216 15:13:49.733006 4728 generic.go:334] "Generic (PLEG): container finished" podID="da29f3ca-e4e7-4f01-9dfa-315a928c25c3" containerID="fcb5e84498f3a1fc8efe49f0c84566a9af16398357e3de61decf5a732c4cd70b" exitCode=0 Dec 16 15:13:49 crc kubenswrapper[4728]: I1216 15:13:49.733063 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sd955" event={"ID":"da29f3ca-e4e7-4f01-9dfa-315a928c25c3","Type":"ContainerDied","Data":"fcb5e84498f3a1fc8efe49f0c84566a9af16398357e3de61decf5a732c4cd70b"} Dec 16 15:13:50 crc kubenswrapper[4728]: I1216 15:13:50.810311 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"0caa1498f220648c025e03a8056941c008e8bf2cc44e2a95743db8e555e81a2e"} Dec 16 15:13:50 crc kubenswrapper[4728]: I1216 15:13:50.810675 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"eacc737035e36ab2019f3666b2943fd51d98b0025100cf6c33b3535c6ae0d1e0"} Dec 16 15:13:50 crc kubenswrapper[4728]: I1216 15:13:50.810697 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"b60ec8422fffbe19c5fae25d370c4ffe1642c37170dbed2ceabccc428defc1cb"} Dec 16 15:13:50 crc kubenswrapper[4728]: I1216 15:13:50.810709 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"2d4fe38cd1daaa1362d07d133c8ec6bfcfdbcf30ae3a9068df5c15139efa0389"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.853127 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-85fc-account-create-update-6fm7d" event={"ID":"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f","Type":"ContainerDied","Data":"f12d92d63a7486923f16254014440521fbe7f597d44de65db845d2b8a0cd0b15"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.853636 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12d92d63a7486923f16254014440521fbe7f597d44de65db845d2b8a0cd0b15" Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.854681 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sd955" event={"ID":"da29f3ca-e4e7-4f01-9dfa-315a928c25c3","Type":"ContainerDied","Data":"bef2c92898a58bdf7f2c12d9f526bc3495bf866aa7e8dad0bb9f48efd3889011"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.854720 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bef2c92898a58bdf7f2c12d9f526bc3495bf866aa7e8dad0bb9f48efd3889011" Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.856026 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t2ct9" event={"ID":"b4129cf7-bc79-4961-aed2-ff704fd4c29e","Type":"ContainerDied","Data":"a90cacd3d2f0e9a3bbf2fc3f40ef20d99e7ee5a866bcb447ac86090262d76dfc"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.856057 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90cacd3d2f0e9a3bbf2fc3f40ef20d99e7ee5a866bcb447ac86090262d76dfc" Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.857929 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r77w5" 
event={"ID":"54d4ab17-9896-4998-a391-38740aabe347","Type":"ContainerDied","Data":"a91d5fa4e65db4064a5216ddda28a2ba79472c7758ccc87883cd6e414e6615f3"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.857971 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a91d5fa4e65db4064a5216ddda28a2ba79472c7758ccc87883cd6e414e6615f3" Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.862518 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb5d-account-create-update-vx5xh" event={"ID":"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e","Type":"ContainerDied","Data":"a4e1c639abc22ec0fe215f3c4178c9bb4d1987964915ed42482ad5f1bed78cc3"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.862602 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4e1c639abc22ec0fe215f3c4178c9bb4d1987964915ed42482ad5f1bed78cc3" Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.864197 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dce7-account-create-update-56dlj" event={"ID":"02a1be12-1801-4429-90e9-120ecaa41788","Type":"ContainerDied","Data":"f0def956cdb2f98fb11ee3bf48d3955af09e5ffc54809e9eaa916a44c67bb4f2"} Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.864225 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0def956cdb2f98fb11ee3bf48d3955af09e5ffc54809e9eaa916a44c67bb4f2" Dec 16 15:13:54 crc kubenswrapper[4728]: I1216 15:13:54.990751 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.062783 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sd955" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.074222 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.084329 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.118702 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.137393 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.169468 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c5jb\" (UniqueName: \"kubernetes.io/projected/54d4ab17-9896-4998-a391-38740aabe347-kube-api-access-2c5jb\") pod \"54d4ab17-9896-4998-a391-38740aabe347\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.169756 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4129cf7-bc79-4961-aed2-ff704fd4c29e-operator-scripts\") pod \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.169910 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-operator-scripts\") pod \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.170349 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk8jt\" (UniqueName: \"kubernetes.io/projected/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-kube-api-access-gk8jt\") pod \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\" (UID: \"da29f3ca-e4e7-4f01-9dfa-315a928c25c3\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.170530 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzxpc\" (UniqueName: \"kubernetes.io/projected/b4129cf7-bc79-4961-aed2-ff704fd4c29e-kube-api-access-dzxpc\") pod \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\" (UID: \"b4129cf7-bc79-4961-aed2-ff704fd4c29e\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.170631 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d4ab17-9896-4998-a391-38740aabe347-operator-scripts\") pod \"54d4ab17-9896-4998-a391-38740aabe347\" (UID: \"54d4ab17-9896-4998-a391-38740aabe347\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.170234 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4129cf7-bc79-4961-aed2-ff704fd4c29e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4129cf7-bc79-4961-aed2-ff704fd4c29e" (UID: "b4129cf7-bc79-4961-aed2-ff704fd4c29e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.170287 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da29f3ca-e4e7-4f01-9dfa-315a928c25c3" (UID: "da29f3ca-e4e7-4f01-9dfa-315a928c25c3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.171461 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54d4ab17-9896-4998-a391-38740aabe347-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54d4ab17-9896-4998-a391-38740aabe347" (UID: "54d4ab17-9896-4998-a391-38740aabe347"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.173713 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d4ab17-9896-4998-a391-38740aabe347-kube-api-access-2c5jb" (OuterVolumeSpecName: "kube-api-access-2c5jb") pod "54d4ab17-9896-4998-a391-38740aabe347" (UID: "54d4ab17-9896-4998-a391-38740aabe347"). InnerVolumeSpecName "kube-api-access-2c5jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.175899 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4129cf7-bc79-4961-aed2-ff704fd4c29e-kube-api-access-dzxpc" (OuterVolumeSpecName: "kube-api-access-dzxpc") pod "b4129cf7-bc79-4961-aed2-ff704fd4c29e" (UID: "b4129cf7-bc79-4961-aed2-ff704fd4c29e"). InnerVolumeSpecName "kube-api-access-dzxpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.176679 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-kube-api-access-gk8jt" (OuterVolumeSpecName: "kube-api-access-gk8jt") pod "da29f3ca-e4e7-4f01-9dfa-315a928c25c3" (UID: "da29f3ca-e4e7-4f01-9dfa-315a928c25c3"). InnerVolumeSpecName "kube-api-access-gk8jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.271832 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-operator-scripts\") pod \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.271885 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlz6b\" (UniqueName: \"kubernetes.io/projected/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-kube-api-access-rlz6b\") pod \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\" (UID: \"78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.271996 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fntgt\" (UniqueName: \"kubernetes.io/projected/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-kube-api-access-fntgt\") pod \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272030 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hspf\" (UniqueName: \"kubernetes.io/projected/02a1be12-1801-4429-90e9-120ecaa41788-kube-api-access-6hspf\") pod \"02a1be12-1801-4429-90e9-120ecaa41788\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272061 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-operator-scripts\") pod \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\" (UID: \"7c5fa45f-6924-4aca-b07f-f7a26af9ae1e\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272155 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a1be12-1801-4429-90e9-120ecaa41788-operator-scripts\") pod 
\"02a1be12-1801-4429-90e9-120ecaa41788\" (UID: \"02a1be12-1801-4429-90e9-120ecaa41788\") " Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272491 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzxpc\" (UniqueName: \"kubernetes.io/projected/b4129cf7-bc79-4961-aed2-ff704fd4c29e-kube-api-access-dzxpc\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272513 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54d4ab17-9896-4998-a391-38740aabe347-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272522 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c5jb\" (UniqueName: \"kubernetes.io/projected/54d4ab17-9896-4998-a391-38740aabe347-kube-api-access-2c5jb\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272532 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4129cf7-bc79-4961-aed2-ff704fd4c29e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272540 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.272549 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk8jt\" (UniqueName: \"kubernetes.io/projected/da29f3ca-e4e7-4f01-9dfa-315a928c25c3-kube-api-access-gk8jt\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.273835 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" (UID: "78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.273887 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02a1be12-1801-4429-90e9-120ecaa41788-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02a1be12-1801-4429-90e9-120ecaa41788" (UID: "02a1be12-1801-4429-90e9-120ecaa41788"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.273950 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" (UID: "7c5fa45f-6924-4aca-b07f-f7a26af9ae1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.278842 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-kube-api-access-rlz6b" (OuterVolumeSpecName: "kube-api-access-rlz6b") pod "78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" (UID: "78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f"). InnerVolumeSpecName "kube-api-access-rlz6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.278872 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-kube-api-access-fntgt" (OuterVolumeSpecName: "kube-api-access-fntgt") pod "7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" (UID: "7c5fa45f-6924-4aca-b07f-f7a26af9ae1e"). InnerVolumeSpecName "kube-api-access-fntgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.285516 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a1be12-1801-4429-90e9-120ecaa41788-kube-api-access-6hspf" (OuterVolumeSpecName: "kube-api-access-6hspf") pod "02a1be12-1801-4429-90e9-120ecaa41788" (UID: "02a1be12-1801-4429-90e9-120ecaa41788"). InnerVolumeSpecName "kube-api-access-6hspf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.374220 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a1be12-1801-4429-90e9-120ecaa41788-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.374251 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.374260 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlz6b\" (UniqueName: \"kubernetes.io/projected/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f-kube-api-access-rlz6b\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.374270 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fntgt\" (UniqueName: \"kubernetes.io/projected/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-kube-api-access-fntgt\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.374298 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hspf\" (UniqueName: \"kubernetes.io/projected/02a1be12-1801-4429-90e9-120ecaa41788-kube-api-access-6hspf\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.374307 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.870833 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lb8c" event={"ID":"f19c009d-8b36-4b96-9995-541e097b4f21","Type":"ContainerStarted","Data":"2b79213d8f79d2599b4b96667c88887812fa01ba79424fd6e8768af991ad3dc7"} Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.884023 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-r77w5" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.885538 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"3fd58ace40cb8ea2caef4a1f9953482182572f94ca27d18a5a56c7fa0980a9dc"} Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.885569 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"ea987d9e02a5c8c83a3012760028f3741e412d603da776228cbfd53bf64ce6ba"} Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.885581 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fc3761f8-7e22-45e1-8119-a40338b80f1d","Type":"ContainerStarted","Data":"a4ac5b4fd3777112315492f8ca25f010d395a24d2959b2427fcb3871c658ec1d"} Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.885624 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-85fc-account-create-update-6fm7d" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.885951 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb5d-account-create-update-vx5xh" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.886245 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dce7-account-create-update-56dlj" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.886586 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t2ct9" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.886618 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sd955" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.899653 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2lb8c" podStartSLOduration=3.374337605 podStartE2EDuration="9.899634099s" podCreationTimestamp="2025-12-16 15:13:46 +0000 UTC" firstStartedPulling="2025-12-16 15:13:48.366897337 +0000 UTC m=+1009.207076321" lastFinishedPulling="2025-12-16 15:13:54.892193831 +0000 UTC m=+1015.732372815" observedRunningTime="2025-12-16 15:13:55.889285364 +0000 UTC m=+1016.729464348" watchObservedRunningTime="2025-12-16 15:13:55.899634099 +0000 UTC m=+1016.739813083" Dec 16 15:13:55 crc kubenswrapper[4728]: I1216 15:13:55.928975 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=24.650016573 podStartE2EDuration="45.928953417s" podCreationTimestamp="2025-12-16 15:13:10 +0000 UTC" firstStartedPulling="2025-12-16 15:13:28.636701129 +0000 UTC m=+989.476880113" lastFinishedPulling="2025-12-16 15:13:49.915637973 +0000 UTC m=+1010.755816957" observedRunningTime="2025-12-16 15:13:55.921971562 +0000 UTC m=+1016.762150556" watchObservedRunningTime="2025-12-16 15:13:55.928953417 +0000 UTC m=+1016.769132421" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.193507 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-mqtfr"] Dec 16 15:13:56 crc kubenswrapper[4728]: E1216 15:13:56.193901 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da29f3ca-e4e7-4f01-9dfa-315a928c25c3" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.193925 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="da29f3ca-e4e7-4f01-9dfa-315a928c25c3" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: E1216 15:13:56.193939 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d4ab17-9896-4998-a391-38740aabe347" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.193947 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d4ab17-9896-4998-a391-38740aabe347" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: E1216 15:13:56.193957 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4129cf7-bc79-4961-aed2-ff704fd4c29e" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.193966 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4129cf7-bc79-4961-aed2-ff704fd4c29e" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: E1216 15:13:56.193982 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a1be12-1801-4429-90e9-120ecaa41788" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.193989 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a1be12-1801-4429-90e9-120ecaa41788" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: E1216 15:13:56.194016 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0b4d59-ea7a-453d-9bc3-7fb4122fc946" containerName="ovn-config" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194024 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0b4d59-ea7a-453d-9bc3-7fb4122fc946" containerName="ovn-config" Dec 16 15:13:56 crc kubenswrapper[4728]: 
E1216 15:13:56.194034 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194042 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: E1216 15:13:56.194060 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194069 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194244 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d4ab17-9896-4998-a391-38740aabe347" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194261 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4129cf7-bc79-4961-aed2-ff704fd4c29e" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194271 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194286 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="da29f3ca-e4e7-4f01-9dfa-315a928c25c3" containerName="mariadb-database-create" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194298 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194309 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0b4d59-ea7a-453d-9bc3-7fb4122fc946" containerName="ovn-config" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.194319 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a1be12-1801-4429-90e9-120ecaa41788" containerName="mariadb-account-create-update" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.195296 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.204756 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.240474 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-mqtfr"] Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.289449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.289494 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.289521 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.289618 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.289932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-config\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.289983 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cnj\" (UniqueName: \"kubernetes.io/projected/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-kube-api-access-b9cnj\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.391309 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.392293 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: 
\"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.392293 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.391398 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.392391 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.392466 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.393136 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.393201 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-svc\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.393320 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-config\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.393346 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cnj\" (UniqueName: \"kubernetes.io/projected/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-kube-api-access-b9cnj\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.394020 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-config\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc 
kubenswrapper[4728]: I1216 15:13:56.415370 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cnj\" (UniqueName: \"kubernetes.io/projected/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-kube-api-access-b9cnj\") pod \"dnsmasq-dns-764c5664d7-mqtfr\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:56 crc kubenswrapper[4728]: I1216 15:13:56.531229 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:57 crc kubenswrapper[4728]: I1216 15:13:57.040235 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-mqtfr"] Dec 16 15:13:57 crc kubenswrapper[4728]: I1216 15:13:57.901543 4728 generic.go:334] "Generic (PLEG): container finished" podID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerID="6580a5eff80cb6cc0052bb777e6ebd1e630ed51488ce35e7fd9476ee552dfeda" exitCode=0 Dec 16 15:13:57 crc kubenswrapper[4728]: I1216 15:13:57.901585 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" event={"ID":"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8","Type":"ContainerDied","Data":"6580a5eff80cb6cc0052bb777e6ebd1e630ed51488ce35e7fd9476ee552dfeda"} Dec 16 15:13:57 crc kubenswrapper[4728]: I1216 15:13:57.902171 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" event={"ID":"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8","Type":"ContainerStarted","Data":"f43fb28f1bff9c148cfa16eabbe73931586a597307a5ab4627587eab5b69b651"} Dec 16 15:13:58 crc kubenswrapper[4728]: I1216 15:13:58.916327 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" event={"ID":"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8","Type":"ContainerStarted","Data":"4e7885af217273a7e5964c2be259e0655239f6c0fc5e9b640ed2aa2764c7e074"} Dec 16 15:13:58 crc kubenswrapper[4728]: I1216 15:13:58.916839 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:13:58 crc kubenswrapper[4728]: I1216 15:13:58.944241 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" podStartSLOduration=2.944222744 podStartE2EDuration="2.944222744s" podCreationTimestamp="2025-12-16 15:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:13:58.937960698 +0000 UTC m=+1019.778139732" watchObservedRunningTime="2025-12-16 15:13:58.944222744 +0000 UTC m=+1019.784401728" Dec 16 15:14:01 crc kubenswrapper[4728]: I1216 15:14:01.942974 4728 generic.go:334] "Generic (PLEG): container finished" podID="f19c009d-8b36-4b96-9995-541e097b4f21" containerID="2b79213d8f79d2599b4b96667c88887812fa01ba79424fd6e8768af991ad3dc7" exitCode=0 Dec 16 15:14:01 crc kubenswrapper[4728]: I1216 15:14:01.943051 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lb8c" event={"ID":"f19c009d-8b36-4b96-9995-541e097b4f21","Type":"ContainerDied","Data":"2b79213d8f79d2599b4b96667c88887812fa01ba79424fd6e8768af991ad3dc7"} Dec 16 15:14:02 crc kubenswrapper[4728]: I1216 15:14:02.955524 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tnpp2" event={"ID":"316025cd-8999-4601-a3df-4aaf1dad3a83","Type":"ContainerStarted","Data":"f567ac43ddaa2e5b598e1f0c130271c009569705fae00d3d3d8098c1a09fd023"} Dec 16 15:14:02 
crc kubenswrapper[4728]: I1216 15:14:02.977495 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tnpp2" podStartSLOduration=1.704344225 podStartE2EDuration="36.977475996s" podCreationTimestamp="2025-12-16 15:13:26 +0000 UTC" firstStartedPulling="2025-12-16 15:13:27.107505933 +0000 UTC m=+987.947684917" lastFinishedPulling="2025-12-16 15:14:02.380637704 +0000 UTC m=+1023.220816688" observedRunningTime="2025-12-16 15:14:02.973295896 +0000 UTC m=+1023.813474880" watchObservedRunningTime="2025-12-16 15:14:02.977475996 +0000 UTC m=+1023.817654980" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.259094 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.425033 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngc9h\" (UniqueName: \"kubernetes.io/projected/f19c009d-8b36-4b96-9995-541e097b4f21-kube-api-access-ngc9h\") pod \"f19c009d-8b36-4b96-9995-541e097b4f21\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.425265 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-combined-ca-bundle\") pod \"f19c009d-8b36-4b96-9995-541e097b4f21\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.425323 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-config-data\") pod \"f19c009d-8b36-4b96-9995-541e097b4f21\" (UID: \"f19c009d-8b36-4b96-9995-541e097b4f21\") " Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.434473 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19c009d-8b36-4b96-9995-541e097b4f21-kube-api-access-ngc9h" (OuterVolumeSpecName: "kube-api-access-ngc9h") pod "f19c009d-8b36-4b96-9995-541e097b4f21" (UID: "f19c009d-8b36-4b96-9995-541e097b4f21"). InnerVolumeSpecName "kube-api-access-ngc9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.460665 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f19c009d-8b36-4b96-9995-541e097b4f21" (UID: "f19c009d-8b36-4b96-9995-541e097b4f21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.487312 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-config-data" (OuterVolumeSpecName: "config-data") pod "f19c009d-8b36-4b96-9995-541e097b4f21" (UID: "f19c009d-8b36-4b96-9995-541e097b4f21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.528387 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.528497 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19c009d-8b36-4b96-9995-541e097b4f21-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.528511 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngc9h\" (UniqueName: \"kubernetes.io/projected/f19c009d-8b36-4b96-9995-541e097b4f21-kube-api-access-ngc9h\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.967251 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lb8c" event={"ID":"f19c009d-8b36-4b96-9995-541e097b4f21","Type":"ContainerDied","Data":"af09a250631b63d294e0db2752b3fd008ba329a8a112694570d03d29d44fbaef"} Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.967291 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af09a250631b63d294e0db2752b3fd008ba329a8a112694570d03d29d44fbaef" Dec 16 15:14:03 crc kubenswrapper[4728]: I1216 15:14:03.967359 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2lb8c" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.260851 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-mqtfr"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.261484 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerName="dnsmasq-dns" containerID="cri-o://4e7885af217273a7e5964c2be259e0655239f6c0fc5e9b640ed2aa2764c7e074" gracePeriod=10 Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.263549 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.282924 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6r2wn"] Dec 16 15:14:04 crc kubenswrapper[4728]: E1216 15:14:04.284009 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19c009d-8b36-4b96-9995-541e097b4f21" containerName="keystone-db-sync" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.284037 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19c009d-8b36-4b96-9995-541e097b4f21" containerName="keystone-db-sync" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.284467 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19c009d-8b36-4b96-9995-541e097b4f21" containerName="keystone-db-sync" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.285333 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.291693 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.291931 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.294601 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.294838 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9cbd6" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.294978 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.307482 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6r2wn"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.345348 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-wvrgw"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.368269 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.398108 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-wvrgw"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.447503 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt2vh\" (UniqueName: \"kubernetes.io/projected/4d114940-603d-4295-9d87-0ae17259f37c-kube-api-access-bt2vh\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.447546 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-credential-keys\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.447569 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-config-data\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.447590 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-combined-ca-bundle\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.447608 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-scripts\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 
crc kubenswrapper[4728]: I1216 15:14:04.447660 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-fernet-keys\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.449492 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9d9zb"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.452009 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.461632 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.461880 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k7j25" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.462184 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.490464 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-777cc88b5c-n7899"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.491752 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.493138 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.494278 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.496096 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8gq4b" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.504916 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.509860 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9d9zb"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.544546 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-777cc88b5c-n7899"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.548985 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-combined-ca-bundle\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549041 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt2vh\" (UniqueName: \"kubernetes.io/projected/4d114940-603d-4295-9d87-0ae17259f37c-kube-api-access-bt2vh\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549064 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-config\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549084 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6gf\" (UniqueName: \"kubernetes.io/projected/4870315b-1db4-4d0b-b43d-ae5501b0e07a-kube-api-access-8t6gf\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549105 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-credential-keys\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549122 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-config-data\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549141 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-scripts\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549162 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-combined-ca-bundle\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549179 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-scripts\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549210 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-config-data\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549234 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mzdz\" (UniqueName: \"kubernetes.io/projected/f82109b1-c2b6-462c-8857-d0d8b243f64a-kube-api-access-5mzdz\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549269 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-fernet-keys\") pod 
\"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549285 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-svc\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549304 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549340 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82109b1-c2b6-462c-8857-d0d8b243f64a-etc-machine-id\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549367 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-db-sync-config-data\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549383 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.549456 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.565297 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-credential-keys\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.565486 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-config-data\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.565941 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-combined-ca-bundle\") pod 
\"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.569323 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-scripts\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.579245 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-fernet-keys\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.600950 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gzgrb"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.601988 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.606067 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt2vh\" (UniqueName: \"kubernetes.io/projected/4d114940-603d-4295-9d87-0ae17259f37c-kube-api-access-bt2vh\") pod \"keystone-bootstrap-6r2wn\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.611067 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.611346 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8qdc9" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.637927 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652417 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9542fe8-f57f-4771-ae28-700ce011aa51-logs\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652478 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9542fe8-f57f-4771-ae28-700ce011aa51-horizon-secret-key\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652538 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-config-data\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652586 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mzdz\" (UniqueName: \"kubernetes.io/projected/f82109b1-c2b6-462c-8857-d0d8b243f64a-kube-api-access-5mzdz\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-svc\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652684 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652767 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82109b1-c2b6-462c-8857-d0d8b243f64a-etc-machine-id\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652829 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-db-sync-config-data\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652857 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 
15:14:04.652883 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652943 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-combined-ca-bundle\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.652988 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-scripts\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.653012 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpljc\" (UniqueName: \"kubernetes.io/projected/d9542fe8-f57f-4771-ae28-700ce011aa51-kube-api-access-rpljc\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.653040 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-config\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.653064 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-config-data\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.653089 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6gf\" (UniqueName: \"kubernetes.io/projected/4870315b-1db4-4d0b-b43d-ae5501b0e07a-kube-api-access-8t6gf\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.653133 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-scripts\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.654671 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-svc\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.655272 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.656354 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82109b1-c2b6-462c-8857-d0d8b243f64a-etc-machine-id\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.664163 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-db-sync-config-data\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.662393 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-config-data\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.670059 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.675043 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-scripts\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.679695 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.679846 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-config\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.717211 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gzgrb"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.746456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-combined-ca-bundle\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.766947 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-db-sync-config-data\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767033 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-scripts\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767072 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpljc\" (UniqueName: \"kubernetes.io/projected/d9542fe8-f57f-4771-ae28-700ce011aa51-kube-api-access-rpljc\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767095 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-config-data\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767141 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdsq\" (UniqueName: \"kubernetes.io/projected/60e129cb-0ce5-4289-a50b-2513ab8ba750-kube-api-access-mfdsq\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767168 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9542fe8-f57f-4771-ae28-700ce011aa51-logs\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767185 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9542fe8-f57f-4771-ae28-700ce011aa51-horizon-secret-key\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767227 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-combined-ca-bundle\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.767810 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9542fe8-f57f-4771-ae28-700ce011aa51-logs\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.768856 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-scripts\") pod \"horizon-777cc88b5c-n7899\" (UID: 
\"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.769181 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-config-data\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.778595 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xfxvz"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.779784 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.790860 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hrsxm" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.791070 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.794016 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mzdz\" (UniqueName: \"kubernetes.io/projected/f82109b1-c2b6-462c-8857-d0d8b243f64a-kube-api-access-5mzdz\") pod \"cinder-db-sync-9d9zb\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.798624 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpljc\" (UniqueName: \"kubernetes.io/projected/d9542fe8-f57f-4771-ae28-700ce011aa51-kube-api-access-rpljc\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.798788 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.802645 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c5fb7899f-kbxcj"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.804583 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.805800 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9542fe8-f57f-4771-ae28-700ce011aa51-horizon-secret-key\") pod \"horizon-777cc88b5c-n7899\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.815024 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.824301 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.825475 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t6gf\" (UniqueName: \"kubernetes.io/projected/4870315b-1db4-4d0b-b43d-ae5501b0e07a-kube-api-access-8t6gf\") pod \"dnsmasq-dns-5959f8865f-wvrgw\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.839116 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-wvrgw"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.847989 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868481 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-combined-ca-bundle\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868554 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdsq\" (UniqueName: \"kubernetes.io/projected/60e129cb-0ce5-4289-a50b-2513ab8ba750-kube-api-access-mfdsq\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868577 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7nn\" (UniqueName: \"kubernetes.io/projected/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-kube-api-access-hr7nn\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868602 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-config-data\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868617 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-logs\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868635 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-scripts\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-combined-ca-bundle\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 
15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.868724 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-db-sync-config-data\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.881492 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xfxvz"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.875452 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-combined-ca-bundle\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.891983 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-db-sync-config-data\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.937970 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdsq\" (UniqueName: \"kubernetes.io/projected/60e129cb-0ce5-4289-a50b-2513ab8ba750-kube-api-access-mfdsq\") pod \"barbican-db-sync-gzgrb\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.954104 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c5fb7899f-kbxcj"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.954154 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hsjfc"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.955495 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.969995 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-horizon-secret-key\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970093 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-combined-ca-bundle\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970116 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-logs\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970171 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-config-data\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970210 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7nn\" (UniqueName: \"kubernetes.io/projected/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-kube-api-access-hr7nn\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970236 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45gn\" (UniqueName: \"kubernetes.io/projected/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-kube-api-access-n45gn\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970266 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-config-data\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-logs\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970312 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-scripts\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" 
Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.970337 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-scripts\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.989703 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-combined-ca-bundle\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.990172 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-logs\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.991429 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hsjfc"] Dec 16 15:14:04 crc kubenswrapper[4728]: I1216 15:14:04.994999 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-config-data\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.008547 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-scripts\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.011551 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7nn\" (UniqueName: \"kubernetes.io/projected/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-kube-api-access-hr7nn\") pod \"placement-db-sync-xfxvz\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.025597 4728 generic.go:334] "Generic (PLEG): container finished" podID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerID="4e7885af217273a7e5964c2be259e0655239f6c0fc5e9b640ed2aa2764c7e074" exitCode=0 Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.025655 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wcv69"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.026724 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" event={"ID":"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8","Type":"ContainerDied","Data":"4e7885af217273a7e5964c2be259e0655239f6c0fc5e9b640ed2aa2764c7e074"} Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.026812 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.039519 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tkr4w" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.039601 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.040045 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.049022 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wcv69"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077124 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-horizon-secret-key\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077224 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpnd\" (UniqueName: \"kubernetes.io/projected/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-kube-api-access-kfpnd\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077267 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-logs\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077330 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-config-data\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077353 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077396 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077433 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-config\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077464 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45gn\" (UniqueName: \"kubernetes.io/projected/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-kube-api-access-n45gn\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.077510 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.081498 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-scripts\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.081582 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.091195 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.091651 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-logs\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.091216 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-horizon-secret-key\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.097567 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-config-data\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.103542 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-scripts\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.111027 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.116619 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.116666 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.117748 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45gn\" (UniqueName: \"kubernetes.io/projected/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-kube-api-access-n45gn\") pod \"horizon-5c5fb7899f-kbxcj\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.193936 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-combined-ca-bundle\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.193984 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccc6l\" (UniqueName: \"kubernetes.io/projected/04fe707b-a597-4768-8190-6efb7aea9faa-kube-api-access-ccc6l\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194008 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194031 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194051 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194070 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-config\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194093 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 
15:14:05.194107 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-config-data\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194126 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-log-httpd\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194157 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194238 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-scripts\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194389 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9h2w\" (UniqueName: \"kubernetes.io/projected/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-kube-api-access-p9h2w\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194467 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-config\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194572 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194616 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-run-httpd\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194642 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpnd\" (UniqueName: \"kubernetes.io/projected/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-kube-api-access-kfpnd\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194990 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.194989 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.195489 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.195526 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.199030 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-config\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.219719 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.222591 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.224471 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.225319 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpnd\" (UniqueName: \"kubernetes.io/projected/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-kube-api-access-kfpnd\") pod \"dnsmasq-dns-58dd9ff6bc-hsjfc\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.255925 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xfxvz" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.269257 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-sb\") pod \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301223 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9cnj\" (UniqueName: \"kubernetes.io/projected/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-kube-api-access-b9cnj\") pod \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301362 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-config\") pod \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301391 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-swift-storage-0\") pod \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301452 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-nb\") pod \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301476 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-svc\") pod \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\" (UID: \"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8\") " Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301676 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-scripts\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301715 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9h2w\" (UniqueName: \"kubernetes.io/projected/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-kube-api-access-p9h2w\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301745 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-config\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301795 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301821 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-run-httpd\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301851 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-combined-ca-bundle\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301875 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccc6l\" (UniqueName: \"kubernetes.io/projected/04fe707b-a597-4768-8190-6efb7aea9faa-kube-api-access-ccc6l\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301895 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301924 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-config-data\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.301945 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-log-httpd\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.304284 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-log-httpd\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.318245 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.318811 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-run-httpd\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.319027 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-kube-api-access-b9cnj" (OuterVolumeSpecName: "kube-api-access-b9cnj") pod "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" (UID: "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8"). InnerVolumeSpecName "kube-api-access-b9cnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.329254 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.330264 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9h2w\" (UniqueName: \"kubernetes.io/projected/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-kube-api-access-p9h2w\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.331905 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccc6l\" (UniqueName: \"kubernetes.io/projected/04fe707b-a597-4768-8190-6efb7aea9faa-kube-api-access-ccc6l\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.334181 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-scripts\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.334762 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.337472 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-config\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.338105 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-combined-ca-bundle\") pod \"neutron-db-sync-wcv69\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.361941 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-config-data\") pod \"ceilometer-0\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.364622 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wcv69" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.405029 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9cnj\" (UniqueName: \"kubernetes.io/projected/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-kube-api-access-b9cnj\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.408194 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" (UID: "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.451375 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-config" (OuterVolumeSpecName: "config") pod "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" (UID: "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.481900 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.488424 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" (UID: "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.502497 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" (UID: "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.511334 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.511355 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.511363 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.511373 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.513370 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" (UID: "ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.566680 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6r2wn"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.614139 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.631107 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-wvrgw"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.704612 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-777cc88b5c-n7899"] Dec 16 15:14:05 crc kubenswrapper[4728]: I1216 15:14:05.719264 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9d9zb"] Dec 16 15:14:05 crc kubenswrapper[4728]: W1216 15:14:05.813313 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d114940_603d_4295_9d87_0ae17259f37c.slice/crio-43d7dd6d7e16a95d59b9d3d86e7efbec2aa4de27083e6d51a5af471a3f4a730d WatchSource:0}: Error finding container 43d7dd6d7e16a95d59b9d3d86e7efbec2aa4de27083e6d51a5af471a3f4a730d: Status 404 returned error can't find the container with id 43d7dd6d7e16a95d59b9d3d86e7efbec2aa4de27083e6d51a5af471a3f4a730d Dec 16 15:14:05 crc kubenswrapper[4728]: W1216 15:14:05.816627 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4870315b_1db4_4d0b_b43d_ae5501b0e07a.slice/crio-56c22f282e0dbde1e220a32bcb5b073206a3899e5628e177c5bb7bb1f11bc11c WatchSource:0}: Error finding container 56c22f282e0dbde1e220a32bcb5b073206a3899e5628e177c5bb7bb1f11bc11c: Status 404 returned error can't find the container with id 56c22f282e0dbde1e220a32bcb5b073206a3899e5628e177c5bb7bb1f11bc11c Dec 16 15:14:05 crc 
kubenswrapper[4728]: W1216 15:14:05.822725 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf82109b1_c2b6_462c_8857_d0d8b243f64a.slice/crio-602f0ca44a3ff7f8c6da99b8662d66dc3eede1e66157a97b462bc306b9102e68 WatchSource:0}: Error finding container 602f0ca44a3ff7f8c6da99b8662d66dc3eede1e66157a97b462bc306b9102e68: Status 404 returned error can't find the container with id 602f0ca44a3ff7f8c6da99b8662d66dc3eede1e66157a97b462bc306b9102e68 Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.015830 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c5fb7899f-kbxcj"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.029683 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xfxvz"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.040544 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gzgrb"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.056997 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777cc88b5c-n7899" event={"ID":"d9542fe8-f57f-4771-ae28-700ce011aa51","Type":"ContainerStarted","Data":"f5ad846e3c0efa328591c8a95af670965eb9d818f2b3b3ffe444fcd952cd3e66"} Dec 16 15:14:06 crc kubenswrapper[4728]: W1216 15:14:06.059813 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e129cb_0ce5_4289_a50b_2513ab8ba750.slice/crio-c27d2aadcc6166e6229f3a56c923d05530285d57d32a66907113231a9a043aaf WatchSource:0}: Error finding container c27d2aadcc6166e6229f3a56c923d05530285d57d32a66907113231a9a043aaf: Status 404 returned error can't find the container with id c27d2aadcc6166e6229f3a56c923d05530285d57d32a66907113231a9a043aaf Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.066394 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" event={"ID":"4870315b-1db4-4d0b-b43d-ae5501b0e07a","Type":"ContainerStarted","Data":"56c22f282e0dbde1e220a32bcb5b073206a3899e5628e177c5bb7bb1f11bc11c"} Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.068060 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hsjfc"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.074898 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" event={"ID":"ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8","Type":"ContainerDied","Data":"f43fb28f1bff9c148cfa16eabbe73931586a597307a5ab4627587eab5b69b651"} Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.074939 4728 scope.go:117] "RemoveContainer" containerID="4e7885af217273a7e5964c2be259e0655239f6c0fc5e9b640ed2aa2764c7e074" Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.075050 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-mqtfr" Dec 16 15:14:06 crc kubenswrapper[4728]: W1216 15:14:06.075210 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a0c6b8_4f21_45f6_bfd7_e327b00f5399.slice/crio-971c6362b1e86e4da020431547a22cf531ee9ca48728940865b69c36096fb79c WatchSource:0}: Error finding container 971c6362b1e86e4da020431547a22cf531ee9ca48728940865b69c36096fb79c: Status 404 returned error can't find the container with id 971c6362b1e86e4da020431547a22cf531ee9ca48728940865b69c36096fb79c Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.089628 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r2wn" event={"ID":"4d114940-603d-4295-9d87-0ae17259f37c","Type":"ContainerStarted","Data":"43d7dd6d7e16a95d59b9d3d86e7efbec2aa4de27083e6d51a5af471a3f4a730d"} Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.099956 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5fb7899f-kbxcj" event={"ID":"d6394bdf-e558-4cf1-93b5-7d84ae2318a3","Type":"ContainerStarted","Data":"69c119f00f060d47820c26587cb6094102b8af264c23b9cb337e03deb7f8305d"} Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.104393 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9d9zb" event={"ID":"f82109b1-c2b6-462c-8857-d0d8b243f64a","Type":"ContainerStarted","Data":"602f0ca44a3ff7f8c6da99b8662d66dc3eede1e66157a97b462bc306b9102e68"} Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.105749 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-mqtfr"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.112212 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-mqtfr"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.120783 4728 scope.go:117] "RemoveContainer" containerID="6580a5eff80cb6cc0052bb777e6ebd1e630ed51488ce35e7fd9476ee552dfeda" Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.186585 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wcv69"] Dec 16 15:14:06 crc kubenswrapper[4728]: W1216 15:14:06.190617 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04fe707b_a597_4768_8190_6efb7aea9faa.slice/crio-2c86b28d66820af505511bc5e5f98be31d25d21a265ad6e6a9cfe2fa63533efa WatchSource:0}: Error finding container 2c86b28d66820af505511bc5e5f98be31d25d21a265ad6e6a9cfe2fa63533efa: Status 404 returned error can't find the container with id 2c86b28d66820af505511bc5e5f98be31d25d21a265ad6e6a9cfe2fa63533efa Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.493380 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.884240 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c5fb7899f-kbxcj"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.904240 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.936026 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f986cfd8f-glc7x"] Dec 16 15:14:06 crc kubenswrapper[4728]: E1216 15:14:06.936387 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerName="init" 
Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.936399 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerName="init" Dec 16 15:14:06 crc kubenswrapper[4728]: E1216 15:14:06.936451 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerName="dnsmasq-dns" Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.936458 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerName="dnsmasq-dns" Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.936644 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" containerName="dnsmasq-dns" Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.937464 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:06 crc kubenswrapper[4728]: I1216 15:14:06.947098 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f986cfd8f-glc7x"] Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.046265 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a59cc6f-49ec-464b-9444-21f249ed771b-logs\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.046310 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-config-data\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.046359 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-scripts\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.046377 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a59cc6f-49ec-464b-9444-21f249ed771b-horizon-secret-key\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.046454 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsv2n\" (UniqueName: \"kubernetes.io/projected/1a59cc6f-49ec-464b-9444-21f249ed771b-kube-api-access-rsv2n\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.134849 4728 generic.go:334] "Generic (PLEG): container finished" podID="4870315b-1db4-4d0b-b43d-ae5501b0e07a" containerID="5e9bca9c581972dd53fd82556216ca6dfa6aebea478f56ddc60f20e087e5f1ef" exitCode=0 Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.135128 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" 
event={"ID":"4870315b-1db4-4d0b-b43d-ae5501b0e07a","Type":"ContainerDied","Data":"5e9bca9c581972dd53fd82556216ca6dfa6aebea478f56ddc60f20e087e5f1ef"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.150716 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a59cc6f-49ec-464b-9444-21f249ed771b-logs\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.150798 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-config-data\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.151600 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-scripts\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.151666 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a59cc6f-49ec-464b-9444-21f249ed771b-horizon-secret-key\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.151835 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsv2n\" (UniqueName: \"kubernetes.io/projected/1a59cc6f-49ec-464b-9444-21f249ed771b-kube-api-access-rsv2n\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.152938 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a59cc6f-49ec-464b-9444-21f249ed771b-logs\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.155694 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-config-data\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.156234 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-scripts\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.161773 4728 generic.go:334] "Generic (PLEG): container finished" podID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerID="5c1d9754059a6b2191c6717276e1da8c47384dbe1fa68eb615636a1acff43027" exitCode=0 Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.161880 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/1a59cc6f-49ec-464b-9444-21f249ed771b-horizon-secret-key\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.161897 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" event={"ID":"51a0c6b8-4f21-45f6-bfd7-e327b00f5399","Type":"ContainerDied","Data":"5c1d9754059a6b2191c6717276e1da8c47384dbe1fa68eb615636a1acff43027"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.161924 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" event={"ID":"51a0c6b8-4f21-45f6-bfd7-e327b00f5399","Type":"ContainerStarted","Data":"971c6362b1e86e4da020431547a22cf531ee9ca48728940865b69c36096fb79c"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.180602 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wcv69" event={"ID":"04fe707b-a597-4768-8190-6efb7aea9faa","Type":"ContainerStarted","Data":"330bb68c46623995b5939a30a6e76c4843fed254b676083b2d151a5ad3c2433e"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.180644 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wcv69" event={"ID":"04fe707b-a597-4768-8190-6efb7aea9faa","Type":"ContainerStarted","Data":"2c86b28d66820af505511bc5e5f98be31d25d21a265ad6e6a9cfe2fa63533efa"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.184344 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsv2n\" (UniqueName: \"kubernetes.io/projected/1a59cc6f-49ec-464b-9444-21f249ed771b-kube-api-access-rsv2n\") pod \"horizon-7f986cfd8f-glc7x\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.198567 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce","Type":"ContainerStarted","Data":"fc6ec3c43415a7f2328272ec27591e6e1be18f0456a67f5da211ef8eac91e127"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.215072 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wcv69" podStartSLOduration=3.215055778 podStartE2EDuration="3.215055778s" podCreationTimestamp="2025-12-16 15:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:07.211717019 +0000 UTC m=+1028.051896003" watchObservedRunningTime="2025-12-16 15:14:07.215055778 +0000 UTC m=+1028.055234762" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.222695 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r2wn" event={"ID":"4d114940-603d-4295-9d87-0ae17259f37c","Type":"ContainerStarted","Data":"f60f105b427383c69359296263469ae00ab54d3f8189124ab7a543f226cd6d2e"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.224521 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gzgrb" event={"ID":"60e129cb-0ce5-4289-a50b-2513ab8ba750","Type":"ContainerStarted","Data":"c27d2aadcc6166e6229f3a56c923d05530285d57d32a66907113231a9a043aaf"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.227443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfxvz" 
event={"ID":"d8cfd92c-8ec9-4d81-a119-2c35893fba2b","Type":"ContainerStarted","Data":"84cba1f7ac61b4c0c1fddb66e5cfb96cd040347a1f4a5e2e2d38d346dc6f3c0c"} Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.248197 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6r2wn" podStartSLOduration=3.248179808 podStartE2EDuration="3.248179808s" podCreationTimestamp="2025-12-16 15:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:07.247562602 +0000 UTC m=+1028.087741586" watchObservedRunningTime="2025-12-16 15:14:07.248179808 +0000 UTC m=+1028.088358792" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.266035 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.526127 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8" path="/var/lib/kubelet/pods/ffcfff7f-c49d-4ee0-81fb-f3ed9086d5c8/volumes" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.680153 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.765425 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-sb\") pod \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.765473 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-nb\") pod \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.765556 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-swift-storage-0\") pod \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.765625 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-config\") pod \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.765668 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t6gf\" (UniqueName: \"kubernetes.io/projected/4870315b-1db4-4d0b-b43d-ae5501b0e07a-kube-api-access-8t6gf\") pod \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.765795 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-svc\") pod \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\" (UID: \"4870315b-1db4-4d0b-b43d-ae5501b0e07a\") " Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.788721 4728 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4870315b-1db4-4d0b-b43d-ae5501b0e07a" (UID: "4870315b-1db4-4d0b-b43d-ae5501b0e07a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.796175 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4870315b-1db4-4d0b-b43d-ae5501b0e07a" (UID: "4870315b-1db4-4d0b-b43d-ae5501b0e07a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.805702 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4870315b-1db4-4d0b-b43d-ae5501b0e07a" (UID: "4870315b-1db4-4d0b-b43d-ae5501b0e07a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.817091 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-config" (OuterVolumeSpecName: "config") pod "4870315b-1db4-4d0b-b43d-ae5501b0e07a" (UID: "4870315b-1db4-4d0b-b43d-ae5501b0e07a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.821882 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f986cfd8f-glc7x"] Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.847896 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4870315b-1db4-4d0b-b43d-ae5501b0e07a" (UID: "4870315b-1db4-4d0b-b43d-ae5501b0e07a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.867785 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.867821 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.867833 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.867841 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:07 crc kubenswrapper[4728]: I1216 15:14:07.867849 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4870315b-1db4-4d0b-b43d-ae5501b0e07a-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.245385 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f986cfd8f-glc7x" event={"ID":"1a59cc6f-49ec-464b-9444-21f249ed771b","Type":"ContainerStarted","Data":"57b8ffb9cdc70792c55baa26e3a36933823d2103d4bea0d505b205bfc17db0cb"} Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.247886 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" event={"ID":"4870315b-1db4-4d0b-b43d-ae5501b0e07a","Type":"ContainerDied","Data":"56c22f282e0dbde1e220a32bcb5b073206a3899e5628e177c5bb7bb1f11bc11c"} Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.247914 4728 scope.go:117] "RemoveContainer" containerID="5e9bca9c581972dd53fd82556216ca6dfa6aebea478f56ddc60f20e087e5f1ef" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.248018 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-wvrgw" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.254371 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" event={"ID":"51a0c6b8-4f21-45f6-bfd7-e327b00f5399","Type":"ContainerStarted","Data":"38c520442ab8dc341b03d0615ed829eeef1442a838e4f12e9a497b95fb526f4e"} Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.254585 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.398846 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4870315b-1db4-4d0b-b43d-ae5501b0e07a-kube-api-access-8t6gf" (OuterVolumeSpecName: "kube-api-access-8t6gf") pod "4870315b-1db4-4d0b-b43d-ae5501b0e07a" (UID: "4870315b-1db4-4d0b-b43d-ae5501b0e07a"). InnerVolumeSpecName "kube-api-access-8t6gf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.477144 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t6gf\" (UniqueName: \"kubernetes.io/projected/4870315b-1db4-4d0b-b43d-ae5501b0e07a-kube-api-access-8t6gf\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.644106 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" podStartSLOduration=4.644090813 podStartE2EDuration="4.644090813s" podCreationTimestamp="2025-12-16 15:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:08.272787062 +0000 UTC m=+1029.112966046" watchObservedRunningTime="2025-12-16 15:14:08.644090813 +0000 UTC m=+1029.484269797" Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.657144 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-wvrgw"] Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.665917 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-wvrgw"] Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.818782 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:14:08 crc kubenswrapper[4728]: I1216 15:14:08.818842 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:14:09 crc kubenswrapper[4728]: I1216 15:14:09.517984 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4870315b-1db4-4d0b-b43d-ae5501b0e07a" path="/var/lib/kubelet/pods/4870315b-1db4-4d0b-b43d-ae5501b0e07a/volumes" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.174354 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-777cc88b5c-n7899"] Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.200893 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-589dd4bc84-6zndr"] Dec 16 15:14:13 crc kubenswrapper[4728]: E1216 15:14:13.201425 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4870315b-1db4-4d0b-b43d-ae5501b0e07a" containerName="init" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.201449 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4870315b-1db4-4d0b-b43d-ae5501b0e07a" containerName="init" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.201702 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4870315b-1db4-4d0b-b43d-ae5501b0e07a" containerName="init" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.202913 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.207460 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.235565 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-589dd4bc84-6zndr"] Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.278927 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f986cfd8f-glc7x"] Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.305229 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7585b44dcb-46w99"] Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.310902 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.318064 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7585b44dcb-46w99"] Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381358 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-config-data\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381399 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33646e3-23f5-40a1-88ef-f55bdd5a230c-logs\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381437 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-scripts\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381632 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tz9g\" (UniqueName: \"kubernetes.io/projected/f33646e3-23f5-40a1-88ef-f55bdd5a230c-kube-api-access-4tz9g\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381741 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-combined-ca-bundle\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381853 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-secret-key\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.381880 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-tls-certs\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.483944 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-combined-ca-bundle\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484009 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-secret-key\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484026 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-tls-certs\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484060 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac195fba-37cf-48a1-aa91-c9df824ddfe4-scripts\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484090 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-horizon-tls-certs\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484179 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac195fba-37cf-48a1-aa91-c9df824ddfe4-logs\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484201 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-horizon-secret-key\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.484261 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-config-data\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485640 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-config-data\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4vqb\" (UniqueName: \"kubernetes.io/projected/ac195fba-37cf-48a1-aa91-c9df824ddfe4-kube-api-access-d4vqb\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485703 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33646e3-23f5-40a1-88ef-f55bdd5a230c-logs\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485725 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-scripts\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485755 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tz9g\" (UniqueName: \"kubernetes.io/projected/f33646e3-23f5-40a1-88ef-f55bdd5a230c-kube-api-access-4tz9g\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485789 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac195fba-37cf-48a1-aa91-c9df824ddfe4-config-data\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.485822 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-combined-ca-bundle\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.487339 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-scripts\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.487592 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33646e3-23f5-40a1-88ef-f55bdd5a230c-logs\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.498138 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-secret-key\") pod 
\"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.498166 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-tls-certs\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.498344 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-combined-ca-bundle\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.504804 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tz9g\" (UniqueName: \"kubernetes.io/projected/f33646e3-23f5-40a1-88ef-f55bdd5a230c-kube-api-access-4tz9g\") pod \"horizon-589dd4bc84-6zndr\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.524270 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.587744 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-combined-ca-bundle\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.588019 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac195fba-37cf-48a1-aa91-c9df824ddfe4-config-data\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.588286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac195fba-37cf-48a1-aa91-c9df824ddfe4-scripts\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.588461 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-horizon-tls-certs\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.588616 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac195fba-37cf-48a1-aa91-c9df824ddfe4-logs\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.588733 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-horizon-secret-key\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.588855 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4vqb\" (UniqueName: \"kubernetes.io/projected/ac195fba-37cf-48a1-aa91-c9df824ddfe4-kube-api-access-d4vqb\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.589654 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac195fba-37cf-48a1-aa91-c9df824ddfe4-config-data\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.589906 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac195fba-37cf-48a1-aa91-c9df824ddfe4-logs\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.590450 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac195fba-37cf-48a1-aa91-c9df824ddfe4-scripts\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.592320 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-combined-ca-bundle\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.593399 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-horizon-tls-certs\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.594069 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ac195fba-37cf-48a1-aa91-c9df824ddfe4-horizon-secret-key\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.607601 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4vqb\" (UniqueName: \"kubernetes.io/projected/ac195fba-37cf-48a1-aa91-c9df824ddfe4-kube-api-access-d4vqb\") pod \"horizon-7585b44dcb-46w99\" (UID: \"ac195fba-37cf-48a1-aa91-c9df824ddfe4\") " pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:13 crc kubenswrapper[4728]: I1216 15:14:13.637502 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:15 crc kubenswrapper[4728]: I1216 15:14:15.312479 4728 generic.go:334] "Generic (PLEG): container finished" podID="4d114940-603d-4295-9d87-0ae17259f37c" containerID="f60f105b427383c69359296263469ae00ab54d3f8189124ab7a543f226cd6d2e" exitCode=0 Dec 16 15:14:15 crc kubenswrapper[4728]: I1216 15:14:15.312784 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r2wn" event={"ID":"4d114940-603d-4295-9d87-0ae17259f37c","Type":"ContainerDied","Data":"f60f105b427383c69359296263469ae00ab54d3f8189124ab7a543f226cd6d2e"} Dec 16 15:14:15 crc kubenswrapper[4728]: I1216 15:14:15.320595 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:15 crc kubenswrapper[4728]: I1216 15:14:15.414042 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jmrcl"] Dec 16 15:14:15 crc kubenswrapper[4728]: I1216 15:14:15.414313 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jmrcl" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" containerID="cri-o://81615126d01d5494e4472780ee7dfba71331aadf757f7b5a51b1e7d2ef5ba66b" gracePeriod=10 Dec 16 15:14:16 crc kubenswrapper[4728]: I1216 15:14:16.348346 4728 generic.go:334] "Generic (PLEG): container finished" podID="a7035612-bffa-4357-aa7e-897240b10c43" containerID="81615126d01d5494e4472780ee7dfba71331aadf757f7b5a51b1e7d2ef5ba66b" exitCode=0 Dec 16 15:14:16 crc kubenswrapper[4728]: I1216 15:14:16.348446 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jmrcl" event={"ID":"a7035612-bffa-4357-aa7e-897240b10c43","Type":"ContainerDied","Data":"81615126d01d5494e4472780ee7dfba71331aadf757f7b5a51b1e7d2ef5ba66b"} Dec 16 15:14:21 crc kubenswrapper[4728]: I1216 15:14:21.267973 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jmrcl" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 16 15:14:22 crc kubenswrapper[4728]: E1216 15:14:22.749437 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 16 15:14:22 crc kubenswrapper[4728]: E1216 15:14:22.749976 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hr7nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-xfxvz_openstack(d8cfd92c-8ec9-4d81-a119-2c35893fba2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:14:22 crc kubenswrapper[4728]: E1216 15:14:22.751207 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-xfxvz" podUID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" Dec 16 15:14:23 crc kubenswrapper[4728]: E1216 15:14:23.407610 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-xfxvz" podUID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" Dec 16 15:14:26 crc kubenswrapper[4728]: I1216 15:14:26.269141 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jmrcl" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.690004 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.690669 4728 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b9hdbh5f5h5c5h655h664h59fhd7h686h58fh78h5cch5dbh5f4h5ch68bh57bh586h54fh64dhc4h5dfh597hf7h57ch74h55bh578h57dh5f4hbch74q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n45gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5c5fb7899f-kbxcj_openstack(d6394bdf-e558-4cf1-93b5-7d84ae2318a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.698307 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:f270f645e84ca176715df916e0295cff2d2c87909c8bff7cf4f7a389896e9d82: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-cinder-api/blobs/sha256:f270f645e84ca176715df916e0295cff2d2c87909c8bff7cf4f7a389896e9d82\": context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.698556 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mzdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9d9zb_openstack(f82109b1-c2b6-462c-8857-d0d8b243f64a): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:f270f645e84ca176715df916e0295cff2d2c87909c8bff7cf4f7a389896e9d82: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-cinder-api/blobs/sha256:f270f645e84ca176715df916e0295cff2d2c87909c8bff7cf4f7a389896e9d82\": context canceled" logger="UnhandledError" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.699692 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:f270f645e84ca176715df916e0295cff2d2c87909c8bff7cf4f7a389896e9d82: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-cinder-api/blobs/sha256:f270f645e84ca176715df916e0295cff2d2c87909c8bff7cf4f7a389896e9d82\\\": context canceled\"" pod="openstack/cinder-db-sync-9d9zb" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.702194 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.702311 4728 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5ffh675h9bh64h668h674h8fhcdh54chd7hdchdch5dbh599h658h5d9h5fhbch67ch577hfch65bh669hf4h57dhb4h598h64dh5c6h599hbfh699q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rsv2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f986cfd8f-glc7x_openstack(1a59cc6f-49ec-464b-9444-21f249ed771b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.703570 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5c5fb7899f-kbxcj" podUID="d6394bdf-e558-4cf1-93b5-7d84ae2318a3" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.706618 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7f986cfd8f-glc7x" podUID="1a59cc6f-49ec-464b-9444-21f249ed771b" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.742906 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.743044 4728 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n74h687h7dh7dh5c7h54bh4h5dh78h65ch95h549h5dfhbch99hb7h678h674h647h668h78h548h695h5f4hdh56h56bh656h598h684h5d4h64q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rpljc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-777cc88b5c-n7899_openstack(d9542fe8-f57f-4771-ae28-700ce011aa51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:14:29 crc kubenswrapper[4728]: E1216 15:14:29.745254 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-777cc88b5c-n7899" podUID="d9542fe8-f57f-4771-ae28-700ce011aa51" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.762128 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.766567 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936015 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-dns-svc\") pod \"a7035612-bffa-4357-aa7e-897240b10c43\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936070 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-credential-keys\") pod \"4d114940-603d-4295-9d87-0ae17259f37c\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936153 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-fernet-keys\") pod \"4d114940-603d-4295-9d87-0ae17259f37c\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936193 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqffm\" (UniqueName: \"kubernetes.io/projected/a7035612-bffa-4357-aa7e-897240b10c43-kube-api-access-sqffm\") pod \"a7035612-bffa-4357-aa7e-897240b10c43\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936215 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-combined-ca-bundle\") pod \"4d114940-603d-4295-9d87-0ae17259f37c\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936239 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-nb\") pod \"a7035612-bffa-4357-aa7e-897240b10c43\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936284 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-sb\") pod \"a7035612-bffa-4357-aa7e-897240b10c43\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936336 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt2vh\" (UniqueName: \"kubernetes.io/projected/4d114940-603d-4295-9d87-0ae17259f37c-kube-api-access-bt2vh\") pod \"4d114940-603d-4295-9d87-0ae17259f37c\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936363 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-config-data\") pod \"4d114940-603d-4295-9d87-0ae17259f37c\" (UID: \"4d114940-603d-4295-9d87-0ae17259f37c\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936417 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-scripts\") pod \"4d114940-603d-4295-9d87-0ae17259f37c\" (UID: 
\"4d114940-603d-4295-9d87-0ae17259f37c\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.936437 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-config\") pod \"a7035612-bffa-4357-aa7e-897240b10c43\" (UID: \"a7035612-bffa-4357-aa7e-897240b10c43\") " Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.943632 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4d114940-603d-4295-9d87-0ae17259f37c" (UID: "4d114940-603d-4295-9d87-0ae17259f37c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.943668 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7035612-bffa-4357-aa7e-897240b10c43-kube-api-access-sqffm" (OuterVolumeSpecName: "kube-api-access-sqffm") pod "a7035612-bffa-4357-aa7e-897240b10c43" (UID: "a7035612-bffa-4357-aa7e-897240b10c43"). InnerVolumeSpecName "kube-api-access-sqffm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.943714 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4d114940-603d-4295-9d87-0ae17259f37c" (UID: "4d114940-603d-4295-9d87-0ae17259f37c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.943865 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d114940-603d-4295-9d87-0ae17259f37c-kube-api-access-bt2vh" (OuterVolumeSpecName: "kube-api-access-bt2vh") pod "4d114940-603d-4295-9d87-0ae17259f37c" (UID: "4d114940-603d-4295-9d87-0ae17259f37c"). InnerVolumeSpecName "kube-api-access-bt2vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.944957 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-scripts" (OuterVolumeSpecName: "scripts") pod "4d114940-603d-4295-9d87-0ae17259f37c" (UID: "4d114940-603d-4295-9d87-0ae17259f37c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.972953 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d114940-603d-4295-9d87-0ae17259f37c" (UID: "4d114940-603d-4295-9d87-0ae17259f37c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.973507 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-config-data" (OuterVolumeSpecName: "config-data") pod "4d114940-603d-4295-9d87-0ae17259f37c" (UID: "4d114940-603d-4295-9d87-0ae17259f37c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:29 crc kubenswrapper[4728]: I1216 15:14:29.999093 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a7035612-bffa-4357-aa7e-897240b10c43" (UID: "a7035612-bffa-4357-aa7e-897240b10c43"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.007136 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7035612-bffa-4357-aa7e-897240b10c43" (UID: "a7035612-bffa-4357-aa7e-897240b10c43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.021923 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-config" (OuterVolumeSpecName: "config") pod "a7035612-bffa-4357-aa7e-897240b10c43" (UID: "a7035612-bffa-4357-aa7e-897240b10c43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037761 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt2vh\" (UniqueName: \"kubernetes.io/projected/4d114940-603d-4295-9d87-0ae17259f37c-kube-api-access-bt2vh\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037786 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037796 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037805 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037815 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037824 4728 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037832 4728 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037840 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqffm\" (UniqueName: \"kubernetes.io/projected/a7035612-bffa-4357-aa7e-897240b10c43-kube-api-access-sqffm\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037875 4728 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d114940-603d-4295-9d87-0ae17259f37c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.037885 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.044838 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a7035612-bffa-4357-aa7e-897240b10c43" (UID: "a7035612-bffa-4357-aa7e-897240b10c43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.139349 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7035612-bffa-4357-aa7e-897240b10c43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:30 crc kubenswrapper[4728]: E1216 15:14:30.157247 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 16 15:14:30 crc kubenswrapper[4728]: E1216 15:14:30.157452 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n85h66ch65fhfhd6hf4h55fh696h55dhfdh9h699h66fh5cfh568h6ch89h5c8hb4h677h66h5f6h56dh569h698h596h84hcfh657hf6h55dh5f6q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9h2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.484965 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gzgrb" event={"ID":"60e129cb-0ce5-4289-a50b-2513ab8ba750","Type":"ContainerStarted","Data":"c0977b96e79053722108b55a7d914b08ea2818872b7ad4424a80507ef69b89f8"} Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.489025 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jmrcl" event={"ID":"a7035612-bffa-4357-aa7e-897240b10c43","Type":"ContainerDied","Data":"73281e5377ce7eb69d7ece92f1102d4238b126bbf8a093e07625156d918ce09d"} Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.489048 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jmrcl" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.489068 4728 scope.go:117] "RemoveContainer" containerID="81615126d01d5494e4472780ee7dfba71331aadf757f7b5a51b1e7d2ef5ba66b" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.491111 4728 generic.go:334] "Generic (PLEG): container finished" podID="316025cd-8999-4601-a3df-4aaf1dad3a83" containerID="f567ac43ddaa2e5b598e1f0c130271c009569705fae00d3d3d8098c1a09fd023" exitCode=0 Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.491184 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tnpp2" event={"ID":"316025cd-8999-4601-a3df-4aaf1dad3a83","Type":"ContainerDied","Data":"f567ac43ddaa2e5b598e1f0c130271c009569705fae00d3d3d8098c1a09fd023"} Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.493985 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r2wn" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.497568 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r2wn" event={"ID":"4d114940-603d-4295-9d87-0ae17259f37c","Type":"ContainerDied","Data":"43d7dd6d7e16a95d59b9d3d86e7efbec2aa4de27083e6d51a5af471a3f4a730d"} Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.497617 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d7dd6d7e16a95d59b9d3d86e7efbec2aa4de27083e6d51a5af471a3f4a730d" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.511123 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gzgrb" podStartSLOduration=2.433594352 podStartE2EDuration="26.511099037s" podCreationTimestamp="2025-12-16 15:14:04 +0000 UTC" firstStartedPulling="2025-12-16 15:14:06.066354628 +0000 UTC m=+1026.906533612" lastFinishedPulling="2025-12-16 15:14:30.143859313 +0000 UTC m=+1050.984038297" observedRunningTime="2025-12-16 15:14:30.504281126 +0000 UTC m=+1051.344460130" watchObservedRunningTime="2025-12-16 15:14:30.511099037 +0000 UTC m=+1051.351278021" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.522043 4728 scope.go:117] "RemoveContainer" containerID="f4cd2463810d570605af672c0280bed5f051a2901ad811a329f0cd38cb34d9ad" Dec 16 15:14:30 crc kubenswrapper[4728]: E1216 15:14:30.522240 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9d9zb" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.657338 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jmrcl"] Dec 16 15:14:30 crc kubenswrapper[4728]: W1216 15:14:30.673974 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac195fba_37cf_48a1_aa91_c9df824ddfe4.slice/crio-19b2100c82be7ff4c53ba8fd42be218f3f279b39a0454a33a396c9b341caa8a2 WatchSource:0}: Error finding container 19b2100c82be7ff4c53ba8fd42be218f3f279b39a0454a33a396c9b341caa8a2: Status 404 returned error can't find the container with id 19b2100c82be7ff4c53ba8fd42be218f3f279b39a0454a33a396c9b341caa8a2 Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.679597 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jmrcl"] Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.685936 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7585b44dcb-46w99"] Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.692907 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-589dd4bc84-6zndr"] Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.876137 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6r2wn"] Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.884126 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6r2wn"] Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.984328 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.994009 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mwdss"] Dec 16 15:14:30 crc kubenswrapper[4728]: E1216 15:14:30.994432 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.994448 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" Dec 16 15:14:30 crc kubenswrapper[4728]: E1216 15:14:30.994467 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d114940-603d-4295-9d87-0ae17259f37c" containerName="keystone-bootstrap" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.994474 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d114940-603d-4295-9d87-0ae17259f37c" containerName="keystone-bootstrap" Dec 16 15:14:30 crc kubenswrapper[4728]: E1216 15:14:30.994491 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="init" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.994497 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="init" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.994671 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d114940-603d-4295-9d87-0ae17259f37c" containerName="keystone-bootstrap" Dec 16 15:14:30 crc kubenswrapper[4728]: I1216 15:14:30.994689 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.002590 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.004807 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.005215 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.005370 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.005579 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9cbd6" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.005767 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.022763 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mwdss"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.058263 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064063 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpljc\" (UniqueName: \"kubernetes.io/projected/d9542fe8-f57f-4771-ae28-700ce011aa51-kube-api-access-rpljc\") pod \"d9542fe8-f57f-4771-ae28-700ce011aa51\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064161 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9542fe8-f57f-4771-ae28-700ce011aa51-horizon-secret-key\") pod \"d9542fe8-f57f-4771-ae28-700ce011aa51\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064193 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a59cc6f-49ec-464b-9444-21f249ed771b-logs\") pod \"1a59cc6f-49ec-464b-9444-21f249ed771b\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064225 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-scripts\") pod \"1a59cc6f-49ec-464b-9444-21f249ed771b\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064270 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-config-data\") pod \"1a59cc6f-49ec-464b-9444-21f249ed771b\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064312 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsv2n\" (UniqueName: \"kubernetes.io/projected/1a59cc6f-49ec-464b-9444-21f249ed771b-kube-api-access-rsv2n\") pod \"1a59cc6f-49ec-464b-9444-21f249ed771b\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064344 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a59cc6f-49ec-464b-9444-21f249ed771b-horizon-secret-key\") pod \"1a59cc6f-49ec-464b-9444-21f249ed771b\" (UID: \"1a59cc6f-49ec-464b-9444-21f249ed771b\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064374 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9542fe8-f57f-4771-ae28-700ce011aa51-logs\") pod \"d9542fe8-f57f-4771-ae28-700ce011aa51\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064486 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-config-data\") pod \"d9542fe8-f57f-4771-ae28-700ce011aa51\" (UID: \"d9542fe8-f57f-4771-ae28-700ce011aa51\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064537 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-scripts\") pod \"d9542fe8-f57f-4771-ae28-700ce011aa51\" (UID: 
\"d9542fe8-f57f-4771-ae28-700ce011aa51\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064855 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-config-data\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.064896 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-combined-ca-bundle\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.065022 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqxp\" (UniqueName: \"kubernetes.io/projected/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-kube-api-access-bjqxp\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.065094 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-fernet-keys\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.066600 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a59cc6f-49ec-464b-9444-21f249ed771b-logs" (OuterVolumeSpecName: "logs") pod "1a59cc6f-49ec-464b-9444-21f249ed771b" (UID: "1a59cc6f-49ec-464b-9444-21f249ed771b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.067805 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-scripts" (OuterVolumeSpecName: "scripts") pod "d9542fe8-f57f-4771-ae28-700ce011aa51" (UID: "d9542fe8-f57f-4771-ae28-700ce011aa51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.067816 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9542fe8-f57f-4771-ae28-700ce011aa51-logs" (OuterVolumeSpecName: "logs") pod "d9542fe8-f57f-4771-ae28-700ce011aa51" (UID: "d9542fe8-f57f-4771-ae28-700ce011aa51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.068038 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-config-data" (OuterVolumeSpecName: "config-data") pod "1a59cc6f-49ec-464b-9444-21f249ed771b" (UID: "1a59cc6f-49ec-464b-9444-21f249ed771b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.069200 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-config-data" (OuterVolumeSpecName: "config-data") pod "d9542fe8-f57f-4771-ae28-700ce011aa51" (UID: "d9542fe8-f57f-4771-ae28-700ce011aa51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.072056 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59cc6f-49ec-464b-9444-21f249ed771b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1a59cc6f-49ec-464b-9444-21f249ed771b" (UID: "1a59cc6f-49ec-464b-9444-21f249ed771b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075188 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-scripts" (OuterVolumeSpecName: "scripts") pod "1a59cc6f-49ec-464b-9444-21f249ed771b" (UID: "1a59cc6f-49ec-464b-9444-21f249ed771b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075308 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-credential-keys\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075396 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-scripts\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075566 4728 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a59cc6f-49ec-464b-9444-21f249ed771b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075585 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9542fe8-f57f-4771-ae28-700ce011aa51-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075596 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075604 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9542fe8-f57f-4771-ae28-700ce011aa51-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075613 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a59cc6f-49ec-464b-9444-21f249ed771b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075621 4728 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.075631 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a59cc6f-49ec-464b-9444-21f249ed771b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.076254 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9542fe8-f57f-4771-ae28-700ce011aa51-kube-api-access-rpljc" (OuterVolumeSpecName: "kube-api-access-rpljc") pod "d9542fe8-f57f-4771-ae28-700ce011aa51" (UID: "d9542fe8-f57f-4771-ae28-700ce011aa51"). InnerVolumeSpecName "kube-api-access-rpljc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.076290 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.076647 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9542fe8-f57f-4771-ae28-700ce011aa51-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d9542fe8-f57f-4771-ae28-700ce011aa51" (UID: "d9542fe8-f57f-4771-ae28-700ce011aa51"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.081468 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a59cc6f-49ec-464b-9444-21f249ed771b-kube-api-access-rsv2n" (OuterVolumeSpecName: "kube-api-access-rsv2n") pod "1a59cc6f-49ec-464b-9444-21f249ed771b" (UID: "1a59cc6f-49ec-464b-9444-21f249ed771b"). InnerVolumeSpecName "kube-api-access-rsv2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177042 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-config-data\") pod \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177148 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-scripts\") pod \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177250 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-logs\") pod \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177294 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n45gn\" (UniqueName: \"kubernetes.io/projected/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-kube-api-access-n45gn\") pod \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177366 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-horizon-secret-key\") pod \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\" (UID: \"d6394bdf-e558-4cf1-93b5-7d84ae2318a3\") " Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177590 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-fernet-keys\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177622 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-credential-keys\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177639 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-scripts\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177668 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-config-data\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177697 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-combined-ca-bundle\") 
pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177797 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqxp\" (UniqueName: \"kubernetes.io/projected/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-kube-api-access-bjqxp\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177866 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpljc\" (UniqueName: \"kubernetes.io/projected/d9542fe8-f57f-4771-ae28-700ce011aa51-kube-api-access-rpljc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177880 4728 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9542fe8-f57f-4771-ae28-700ce011aa51-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177893 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsv2n\" (UniqueName: \"kubernetes.io/projected/1a59cc6f-49ec-464b-9444-21f249ed771b-kube-api-access-rsv2n\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.177971 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-logs" (OuterVolumeSpecName: "logs") pod "d6394bdf-e558-4cf1-93b5-7d84ae2318a3" (UID: "d6394bdf-e558-4cf1-93b5-7d84ae2318a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.179038 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-config-data" (OuterVolumeSpecName: "config-data") pod "d6394bdf-e558-4cf1-93b5-7d84ae2318a3" (UID: "d6394bdf-e558-4cf1-93b5-7d84ae2318a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.178997 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-scripts" (OuterVolumeSpecName: "scripts") pod "d6394bdf-e558-4cf1-93b5-7d84ae2318a3" (UID: "d6394bdf-e558-4cf1-93b5-7d84ae2318a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.181890 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-scripts\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.184588 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-fernet-keys\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.184590 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-config-data\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.187668 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6394bdf-e558-4cf1-93b5-7d84ae2318a3" (UID: "d6394bdf-e558-4cf1-93b5-7d84ae2318a3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.188078 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-kube-api-access-n45gn" (OuterVolumeSpecName: "kube-api-access-n45gn") pod "d6394bdf-e558-4cf1-93b5-7d84ae2318a3" (UID: "d6394bdf-e558-4cf1-93b5-7d84ae2318a3"). InnerVolumeSpecName "kube-api-access-n45gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.191915 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-credential-keys\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.192493 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-combined-ca-bundle\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.198650 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqxp\" (UniqueName: \"kubernetes.io/projected/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-kube-api-access-bjqxp\") pod \"keystone-bootstrap-mwdss\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.270399 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jmrcl" podUID="a7035612-bffa-4357-aa7e-897240b10c43" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.278947 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.278997 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.279008 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n45gn\" (UniqueName: \"kubernetes.io/projected/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-kube-api-access-n45gn\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.279020 4728 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.279032 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6394bdf-e558-4cf1-93b5-7d84ae2318a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.373330 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.510666 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777cc88b5c-n7899" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.517318 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c5fb7899f-kbxcj" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.520899 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f986cfd8f-glc7x" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.521595 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d114940-603d-4295-9d87-0ae17259f37c" path="/var/lib/kubelet/pods/4d114940-603d-4295-9d87-0ae17259f37c/volumes" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.522268 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7035612-bffa-4357-aa7e-897240b10c43" path="/var/lib/kubelet/pods/a7035612-bffa-4357-aa7e-897240b10c43/volumes" Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.523276 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777cc88b5c-n7899" event={"ID":"d9542fe8-f57f-4771-ae28-700ce011aa51","Type":"ContainerDied","Data":"f5ad846e3c0efa328591c8a95af670965eb9d818f2b3b3ffe444fcd952cd3e66"} Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.523300 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589dd4bc84-6zndr" event={"ID":"f33646e3-23f5-40a1-88ef-f55bdd5a230c","Type":"ContainerStarted","Data":"f02920902c541c62d745a8a4b7de35807d1e8663e2730e585d9f0126a6e8e341"} Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.523310 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7585b44dcb-46w99" event={"ID":"ac195fba-37cf-48a1-aa91-c9df824ddfe4","Type":"ContainerStarted","Data":"19b2100c82be7ff4c53ba8fd42be218f3f279b39a0454a33a396c9b341caa8a2"} Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.523322 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5fb7899f-kbxcj" event={"ID":"d6394bdf-e558-4cf1-93b5-7d84ae2318a3","Type":"ContainerDied","Data":"69c119f00f060d47820c26587cb6094102b8af264c23b9cb337e03deb7f8305d"} Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.523332 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f986cfd8f-glc7x" event={"ID":"1a59cc6f-49ec-464b-9444-21f249ed771b","Type":"ContainerDied","Data":"57b8ffb9cdc70792c55baa26e3a36933823d2103d4bea0d505b205bfc17db0cb"} Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.612908 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c5fb7899f-kbxcj"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.693359 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c5fb7899f-kbxcj"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.708519 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-777cc88b5c-n7899"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.730326 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-777cc88b5c-n7899"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.756559 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f986cfd8f-glc7x"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.769868 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f986cfd8f-glc7x"] Dec 16 15:14:31 crc kubenswrapper[4728]: I1216 15:14:31.885730 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mwdss"] Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.519305 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-tnpp2" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.522666 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a59cc6f-49ec-464b-9444-21f249ed771b" path="/var/lib/kubelet/pods/1a59cc6f-49ec-464b-9444-21f249ed771b/volumes" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.523147 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6394bdf-e558-4cf1-93b5-7d84ae2318a3" path="/var/lib/kubelet/pods/d6394bdf-e558-4cf1-93b5-7d84ae2318a3/volumes" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.523656 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9542fe8-f57f-4771-ae28-700ce011aa51" path="/var/lib/kubelet/pods/d9542fe8-f57f-4771-ae28-700ce011aa51/volumes" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.593653 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mwdss" event={"ID":"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332","Type":"ContainerStarted","Data":"708ed264dcd28b6e46ec61b4ee0e47f0dd2072a6cc701827af5bcb1d61d744c1"} Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.596142 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tnpp2" event={"ID":"316025cd-8999-4601-a3df-4aaf1dad3a83","Type":"ContainerDied","Data":"3e6410007c065e2e20756141fec86ba7ee543c8f0c68dc7fd654ff5a00be8214"} Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.596160 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6410007c065e2e20756141fec86ba7ee543c8f0c68dc7fd654ff5a00be8214" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.596225 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tnpp2" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.630213 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgx5h\" (UniqueName: \"kubernetes.io/projected/316025cd-8999-4601-a3df-4aaf1dad3a83-kube-api-access-jgx5h\") pod \"316025cd-8999-4601-a3df-4aaf1dad3a83\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.630377 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-db-sync-config-data\") pod \"316025cd-8999-4601-a3df-4aaf1dad3a83\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.630434 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-config-data\") pod \"316025cd-8999-4601-a3df-4aaf1dad3a83\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.630605 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-combined-ca-bundle\") pod \"316025cd-8999-4601-a3df-4aaf1dad3a83\" (UID: \"316025cd-8999-4601-a3df-4aaf1dad3a83\") " Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.646506 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316025cd-8999-4601-a3df-4aaf1dad3a83-kube-api-access-jgx5h" (OuterVolumeSpecName: "kube-api-access-jgx5h") pod 
"316025cd-8999-4601-a3df-4aaf1dad3a83" (UID: "316025cd-8999-4601-a3df-4aaf1dad3a83"). InnerVolumeSpecName "kube-api-access-jgx5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.648194 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "316025cd-8999-4601-a3df-4aaf1dad3a83" (UID: "316025cd-8999-4601-a3df-4aaf1dad3a83"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.710663 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316025cd-8999-4601-a3df-4aaf1dad3a83" (UID: "316025cd-8999-4601-a3df-4aaf1dad3a83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.715991 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-config-data" (OuterVolumeSpecName: "config-data") pod "316025cd-8999-4601-a3df-4aaf1dad3a83" (UID: "316025cd-8999-4601-a3df-4aaf1dad3a83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.732208 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.732249 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgx5h\" (UniqueName: \"kubernetes.io/projected/316025cd-8999-4601-a3df-4aaf1dad3a83-kube-api-access-jgx5h\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.732263 4728 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:33 crc kubenswrapper[4728]: I1216 15:14:33.732276 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316025cd-8999-4601-a3df-4aaf1dad3a83-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.605181 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589dd4bc84-6zndr" event={"ID":"f33646e3-23f5-40a1-88ef-f55bdd5a230c","Type":"ContainerStarted","Data":"416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770"} Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.605459 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589dd4bc84-6zndr" event={"ID":"f33646e3-23f5-40a1-88ef-f55bdd5a230c","Type":"ContainerStarted","Data":"ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7"} Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.606729 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7585b44dcb-46w99" event={"ID":"ac195fba-37cf-48a1-aa91-c9df824ddfe4","Type":"ContainerStarted","Data":"bcb1cbc9e1c1ea992bcd689b4f211a16c5ebbea652b06ec3e784be478aa696b8"} Dec 16 15:14:34 crc 
kubenswrapper[4728]: I1216 15:14:34.606787 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7585b44dcb-46w99" event={"ID":"ac195fba-37cf-48a1-aa91-c9df824ddfe4","Type":"ContainerStarted","Data":"5f3c604a6a7ca406b85f5e86b332173de06c90935cf1700c58c4433753a41e40"} Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.608282 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce","Type":"ContainerStarted","Data":"258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131"} Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.610666 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mwdss" event={"ID":"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332","Type":"ContainerStarted","Data":"29617b51e7b854b65239789ae0e78fedf8a5f8ed2142edf34a344ed2782a1b0b"} Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.644395 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-589dd4bc84-6zndr" podStartSLOduration=18.97695568 podStartE2EDuration="21.644364137s" podCreationTimestamp="2025-12-16 15:14:13 +0000 UTC" firstStartedPulling="2025-12-16 15:14:30.717215022 +0000 UTC m=+1051.557394006" lastFinishedPulling="2025-12-16 15:14:33.384623479 +0000 UTC m=+1054.224802463" observedRunningTime="2025-12-16 15:14:34.635384059 +0000 UTC m=+1055.475563063" watchObservedRunningTime="2025-12-16 15:14:34.644364137 +0000 UTC m=+1055.484543161" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.698033 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mwdss" podStartSLOduration=4.698010743 podStartE2EDuration="4.698010743s" podCreationTimestamp="2025-12-16 15:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:34.666485365 +0000 UTC m=+1055.506664399" watchObservedRunningTime="2025-12-16 15:14:34.698010743 +0000 UTC m=+1055.538189727" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.698650 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7585b44dcb-46w99" podStartSLOduration=19.125495535 podStartE2EDuration="21.698641289s" podCreationTimestamp="2025-12-16 15:14:13 +0000 UTC" firstStartedPulling="2025-12-16 15:14:30.679010047 +0000 UTC m=+1051.519189031" lastFinishedPulling="2025-12-16 15:14:33.252155761 +0000 UTC m=+1054.092334785" observedRunningTime="2025-12-16 15:14:34.693971945 +0000 UTC m=+1055.534150979" watchObservedRunningTime="2025-12-16 15:14:34.698641289 +0000 UTC m=+1055.538820273" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.968331 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-qpqs4"] Dec 16 15:14:34 crc kubenswrapper[4728]: E1216 15:14:34.968713 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316025cd-8999-4601-a3df-4aaf1dad3a83" containerName="glance-db-sync" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.968728 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="316025cd-8999-4601-a3df-4aaf1dad3a83" containerName="glance-db-sync" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.968872 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="316025cd-8999-4601-a3df-4aaf1dad3a83" containerName="glance-db-sync" Dec 16 15:14:34 crc kubenswrapper[4728]: I1216 15:14:34.970329 4728 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.022425 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-qpqs4"] Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.059550 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmz7c\" (UniqueName: \"kubernetes.io/projected/e1130519-ad80-4590-a993-f7ebaf324408-kube-api-access-gmz7c\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.059661 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-config\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.059689 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.059710 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.059860 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.059923 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.160884 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.160936 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.160984 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmz7c\" (UniqueName: \"kubernetes.io/projected/e1130519-ad80-4590-a993-f7ebaf324408-kube-api-access-gmz7c\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.161056 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-config\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.161086 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.161106 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.161765 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.162077 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.162324 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.162691 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-config\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.162811 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.177048 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmz7c\" (UniqueName: 
\"kubernetes.io/projected/e1130519-ad80-4590-a993-f7ebaf324408-kube-api-access-gmz7c\") pod \"dnsmasq-dns-785d8bcb8c-qpqs4\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.288305 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.780598 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-qpqs4"] Dec 16 15:14:35 crc kubenswrapper[4728]: W1216 15:14:35.795563 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1130519_ad80_4590_a993_f7ebaf324408.slice/crio-4eccd9270b61e7dd8c6888d5d46b077c7ec4a35658a4696f56543dc2cb1b6ecc WatchSource:0}: Error finding container 4eccd9270b61e7dd8c6888d5d46b077c7ec4a35658a4696f56543dc2cb1b6ecc: Status 404 returned error can't find the container with id 4eccd9270b61e7dd8c6888d5d46b077c7ec4a35658a4696f56543dc2cb1b6ecc Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.894525 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.896161 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.899878 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.900080 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.900214 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-npj6q" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.928760 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976629 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-config-data\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976677 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-logs\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976734 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-scripts\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976762 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7lt\" (UniqueName: 
\"kubernetes.io/projected/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-kube-api-access-7b7lt\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976786 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976840 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:35 crc kubenswrapper[4728]: I1216 15:14:35.976913 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077556 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077642 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077674 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-config-data\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077689 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-logs\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077721 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-scripts\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077743 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7lt\" (UniqueName: 
\"kubernetes.io/projected/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-kube-api-access-7b7lt\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.077761 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.078440 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.078664 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.080827 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-logs\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.089120 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.093461 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-config-data\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.103900 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-scripts\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.109140 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7lt\" (UniqueName: \"kubernetes.io/projected/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-kube-api-access-7b7lt\") pod \"glance-default-external-api-0\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.122920 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.145712 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.147059 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.149861 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.155138 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179065 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179104 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-logs\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179123 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179142 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179160 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179201 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.179292 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk49d\" (UniqueName: \"kubernetes.io/projected/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-kube-api-access-nk49d\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " 
pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.261995 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.283681 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk49d\" (UniqueName: \"kubernetes.io/projected/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-kube-api-access-nk49d\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.284504 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.284951 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-logs\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.284973 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.284991 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.285007 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.285084 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.285291 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-logs\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.284914 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.286279 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.289191 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.290573 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.292219 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.303110 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk49d\" (UniqueName: \"kubernetes.io/projected/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-kube-api-access-nk49d\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.321822 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.485512 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.642855 4728 generic.go:334] "Generic (PLEG): container finished" podID="e1130519-ad80-4590-a993-f7ebaf324408" containerID="30c310f67b6d4ea116cae808b3be9bfec1302e1b6d13bc6758631e60d64e558b" exitCode=0 Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.642891 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" event={"ID":"e1130519-ad80-4590-a993-f7ebaf324408","Type":"ContainerDied","Data":"30c310f67b6d4ea116cae808b3be9bfec1302e1b6d13bc6758631e60d64e558b"} Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.642917 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" event={"ID":"e1130519-ad80-4590-a993-f7ebaf324408","Type":"ContainerStarted","Data":"4eccd9270b61e7dd8c6888d5d46b077c7ec4a35658a4696f56543dc2cb1b6ecc"} Dec 16 15:14:36 crc kubenswrapper[4728]: I1216 15:14:36.904572 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:37 crc kubenswrapper[4728]: I1216 15:14:37.176865 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:37 crc kubenswrapper[4728]: I1216 15:14:37.652523 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a","Type":"ContainerStarted","Data":"47c75d9908c82f3a414194764bc21d56da50b18a0d0bc17ba9c5dafbc667950e"} Dec 16 15:14:37 crc kubenswrapper[4728]: I1216 15:14:37.654766 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" event={"ID":"e1130519-ad80-4590-a993-f7ebaf324408","Type":"ContainerStarted","Data":"d92266dda6f758648ff3612d11455320d2af4da2f9450f82688af7954645e282"} Dec 16 15:14:37 crc kubenswrapper[4728]: I1216 15:14:37.655038 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:37 crc kubenswrapper[4728]: I1216 15:14:37.678152 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" podStartSLOduration=3.678128374 podStartE2EDuration="3.678128374s" podCreationTimestamp="2025-12-16 15:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:37.670213775 +0000 UTC m=+1058.510392759" watchObservedRunningTime="2025-12-16 15:14:37.678128374 +0000 UTC m=+1058.518307358" Dec 16 15:14:37 crc kubenswrapper[4728]: I1216 15:14:37.979239 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.081020 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.666662 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a","Type":"ContainerStarted","Data":"3b8b1fb1e44f24e6a36fdd69c6d5c7be9d8658e29c2c028692fe2b898620a627"} Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.819358 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.819426 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.819468 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.820105 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f528a37171fd283501ab52158c0534c2dc70337f5ffb233b47cd1885a45c673"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:14:38 crc kubenswrapper[4728]: I1216 15:14:38.820158 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://5f528a37171fd283501ab52158c0534c2dc70337f5ffb233b47cd1885a45c673" gracePeriod=600 Dec 16 15:14:39 crc kubenswrapper[4728]: I1216 15:14:39.676423 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5","Type":"ContainerStarted","Data":"1e67330cfd152eafb981a7539dac5805040814a2914bfed3831f4a4140e84ae1"} Dec 16 15:14:39 crc kubenswrapper[4728]: I1216 15:14:39.679373 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="5f528a37171fd283501ab52158c0534c2dc70337f5ffb233b47cd1885a45c673" exitCode=0 Dec 16 15:14:39 crc kubenswrapper[4728]: I1216 15:14:39.679435 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"5f528a37171fd283501ab52158c0534c2dc70337f5ffb233b47cd1885a45c673"} Dec 16 15:14:39 crc kubenswrapper[4728]: I1216 15:14:39.679612 4728 scope.go:117] "RemoveContainer" containerID="0cc664ff3879b159126f992f52a6c4ccf1fc8c0903483566c983c5026f497d68" Dec 16 15:14:40 crc kubenswrapper[4728]: I1216 15:14:40.689878 4728 generic.go:334] "Generic (PLEG): container finished" podID="f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" containerID="29617b51e7b854b65239789ae0e78fedf8a5f8ed2142edf34a344ed2782a1b0b" exitCode=0 Dec 16 15:14:40 crc kubenswrapper[4728]: I1216 15:14:40.689942 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mwdss" event={"ID":"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332","Type":"ContainerDied","Data":"29617b51e7b854b65239789ae0e78fedf8a5f8ed2142edf34a344ed2782a1b0b"} Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.019886 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.101362 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-combined-ca-bundle\") pod \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.101459 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjqxp\" (UniqueName: \"kubernetes.io/projected/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-kube-api-access-bjqxp\") pod \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.101503 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-scripts\") pod \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.101525 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-fernet-keys\") pod \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.101574 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-config-data\") pod \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.101613 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-credential-keys\") pod \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\" (UID: \"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332\") " Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.121573 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-scripts" (OuterVolumeSpecName: "scripts") pod "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" (UID: "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.121645 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-kube-api-access-bjqxp" (OuterVolumeSpecName: "kube-api-access-bjqxp") pod "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" (UID: "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332"). InnerVolumeSpecName "kube-api-access-bjqxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.121731 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" (UID: "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.127470 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" (UID: "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.128582 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-config-data" (OuterVolumeSpecName: "config-data") pod "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" (UID: "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.129716 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" (UID: "f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.203710 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjqxp\" (UniqueName: \"kubernetes.io/projected/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-kube-api-access-bjqxp\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.203753 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.203766 4728 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.203779 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.203792 4728 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.203803 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.525793 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.526109 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.639025 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.639068 4728 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.793764 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mwdss" event={"ID":"f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332","Type":"ContainerDied","Data":"708ed264dcd28b6e46ec61b4ee0e47f0dd2072a6cc701827af5bcb1d61d744c1"} Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.793804 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="708ed264dcd28b6e46ec61b4ee0e47f0dd2072a6cc701827af5bcb1d61d744c1" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.793867 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mwdss" Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.795820 4728 generic.go:334] "Generic (PLEG): container finished" podID="60e129cb-0ce5-4289-a50b-2513ab8ba750" containerID="c0977b96e79053722108b55a7d914b08ea2818872b7ad4424a80507ef69b89f8" exitCode=0 Dec 16 15:14:43 crc kubenswrapper[4728]: I1216 15:14:43.795878 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gzgrb" event={"ID":"60e129cb-0ce5-4289-a50b-2513ab8ba750","Type":"ContainerDied","Data":"c0977b96e79053722108b55a7d914b08ea2818872b7ad4424a80507ef69b89f8"} Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.205975 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5874cbd465-jjmn6"] Dec 16 15:14:44 crc kubenswrapper[4728]: E1216 15:14:44.206370 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" containerName="keystone-bootstrap" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.206386 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" containerName="keystone-bootstrap" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.206832 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" containerName="keystone-bootstrap" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.207491 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.211819 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.212022 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.212182 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.212359 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9cbd6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.212541 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.212621 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.235400 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5874cbd465-jjmn6"] Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.322571 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-scripts\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.322627 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xfjs\" (UniqueName: \"kubernetes.io/projected/18996006-74fc-4090-941f-783741605f54-kube-api-access-6xfjs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.322733 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-credential-keys\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.322862 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-fernet-keys\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.322986 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-internal-tls-certs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.323047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-public-tls-certs\") pod \"keystone-5874cbd465-jjmn6\" (UID: 
\"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.323077 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-combined-ca-bundle\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.323130 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-config-data\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425075 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-credential-keys\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425159 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-fernet-keys\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425210 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-internal-tls-certs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425233 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-public-tls-certs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425252 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-combined-ca-bundle\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425278 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-config-data\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425317 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-scripts\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 
15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.425339 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xfjs\" (UniqueName: \"kubernetes.io/projected/18996006-74fc-4090-941f-783741605f54-kube-api-access-6xfjs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.431127 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-fernet-keys\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.433484 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-scripts\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.435354 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-combined-ca-bundle\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.435485 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-public-tls-certs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.436019 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-credential-keys\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.436066 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-internal-tls-certs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.454188 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18996006-74fc-4090-941f-783741605f54-config-data\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.465040 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xfjs\" (UniqueName: \"kubernetes.io/projected/18996006-74fc-4090-941f-783741605f54-kube-api-access-6xfjs\") pod \"keystone-5874cbd465-jjmn6\" (UID: \"18996006-74fc-4090-941f-783741605f54\") " pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:44 crc kubenswrapper[4728]: I1216 15:14:44.528861 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:45 crc kubenswrapper[4728]: I1216 15:14:45.290625 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:14:45 crc kubenswrapper[4728]: I1216 15:14:45.376508 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hsjfc"] Dec 16 15:14:45 crc kubenswrapper[4728]: I1216 15:14:45.376743 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerName="dnsmasq-dns" containerID="cri-o://38c520442ab8dc341b03d0615ed829eeef1442a838e4f12e9a497b95fb526f4e" gracePeriod=10 Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.830795 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gzgrb" event={"ID":"60e129cb-0ce5-4289-a50b-2513ab8ba750","Type":"ContainerDied","Data":"c27d2aadcc6166e6229f3a56c923d05530285d57d32a66907113231a9a043aaf"} Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.831216 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27d2aadcc6166e6229f3a56c923d05530285d57d32a66907113231a9a043aaf" Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.835999 4728 generic.go:334] "Generic (PLEG): container finished" podID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerID="38c520442ab8dc341b03d0615ed829eeef1442a838e4f12e9a497b95fb526f4e" exitCode=0 Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.836044 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" event={"ID":"51a0c6b8-4f21-45f6-bfd7-e327b00f5399","Type":"ContainerDied","Data":"38c520442ab8dc341b03d0615ed829eeef1442a838e4f12e9a497b95fb526f4e"} Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.903350 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.983702 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-db-sync-config-data\") pod \"60e129cb-0ce5-4289-a50b-2513ab8ba750\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.983825 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdsq\" (UniqueName: \"kubernetes.io/projected/60e129cb-0ce5-4289-a50b-2513ab8ba750-kube-api-access-mfdsq\") pod \"60e129cb-0ce5-4289-a50b-2513ab8ba750\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " Dec 16 15:14:46 crc kubenswrapper[4728]: I1216 15:14:46.983861 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-combined-ca-bundle\") pod \"60e129cb-0ce5-4289-a50b-2513ab8ba750\" (UID: \"60e129cb-0ce5-4289-a50b-2513ab8ba750\") " Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:46.991927 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "60e129cb-0ce5-4289-a50b-2513ab8ba750" (UID: "60e129cb-0ce5-4289-a50b-2513ab8ba750"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:46.996279 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e129cb-0ce5-4289-a50b-2513ab8ba750-kube-api-access-mfdsq" (OuterVolumeSpecName: "kube-api-access-mfdsq") pod "60e129cb-0ce5-4289-a50b-2513ab8ba750" (UID: "60e129cb-0ce5-4289-a50b-2513ab8ba750"). InnerVolumeSpecName "kube-api-access-mfdsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:47.042721 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60e129cb-0ce5-4289-a50b-2513ab8ba750" (UID: "60e129cb-0ce5-4289-a50b-2513ab8ba750"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:47.086164 4728 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:47.086197 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdsq\" (UniqueName: \"kubernetes.io/projected/60e129cb-0ce5-4289-a50b-2513ab8ba750-kube-api-access-mfdsq\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:47.086213 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e129cb-0ce5-4289-a50b-2513ab8ba750-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:47 crc kubenswrapper[4728]: I1216 15:14:47.299548 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.390367 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-svc\") pod \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.390429 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfpnd\" (UniqueName: \"kubernetes.io/projected/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-kube-api-access-kfpnd\") pod \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.390487 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-nb\") pod \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.390524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-swift-storage-0\") pod \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.390658 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-config\") pod \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.390680 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-sb\") pod \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\" (UID: \"51a0c6b8-4f21-45f6-bfd7-e327b00f5399\") " Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.400977 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-kube-api-access-kfpnd" (OuterVolumeSpecName: "kube-api-access-kfpnd") pod "51a0c6b8-4f21-45f6-bfd7-e327b00f5399" (UID: "51a0c6b8-4f21-45f6-bfd7-e327b00f5399"). InnerVolumeSpecName "kube-api-access-kfpnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.458101 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "51a0c6b8-4f21-45f6-bfd7-e327b00f5399" (UID: "51a0c6b8-4f21-45f6-bfd7-e327b00f5399"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.462193 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51a0c6b8-4f21-45f6-bfd7-e327b00f5399" (UID: "51a0c6b8-4f21-45f6-bfd7-e327b00f5399"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.469075 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51a0c6b8-4f21-45f6-bfd7-e327b00f5399" (UID: "51a0c6b8-4f21-45f6-bfd7-e327b00f5399"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.474040 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51a0c6b8-4f21-45f6-bfd7-e327b00f5399" (UID: "51a0c6b8-4f21-45f6-bfd7-e327b00f5399"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.479299 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-config" (OuterVolumeSpecName: "config") pod "51a0c6b8-4f21-45f6-bfd7-e327b00f5399" (UID: "51a0c6b8-4f21-45f6-bfd7-e327b00f5399"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.493286 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.493317 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.493353 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.493368 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfpnd\" (UniqueName: \"kubernetes.io/projected/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-kube-api-access-kfpnd\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.493379 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.493390 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51a0c6b8-4f21-45f6-bfd7-e327b00f5399-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.548242 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5874cbd465-jjmn6"] Dec 16 15:14:51 crc kubenswrapper[4728]: W1216 15:14:47.555469 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18996006_74fc_4090_941f_783741605f54.slice/crio-1e3caa35790d92a87c884f2e86bd6ee6d78a86af83253e53f6493a10e0c589d4 WatchSource:0}: Error finding container 1e3caa35790d92a87c884f2e86bd6ee6d78a86af83253e53f6493a10e0c589d4: Status 404 
returned error can't find the container with id 1e3caa35790d92a87c884f2e86bd6ee6d78a86af83253e53f6493a10e0c589d4 Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.844215 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5874cbd465-jjmn6" event={"ID":"18996006-74fc-4090-941f-783741605f54","Type":"ContainerStarted","Data":"1e3caa35790d92a87c884f2e86bd6ee6d78a86af83253e53f6493a10e0c589d4"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.847032 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gzgrb" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.847922 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.848632 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-hsjfc" event={"ID":"51a0c6b8-4f21-45f6-bfd7-e327b00f5399","Type":"ContainerDied","Data":"971c6362b1e86e4da020431547a22cf531ee9ca48728940865b69c36096fb79c"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.848656 4728 scope.go:117] "RemoveContainer" containerID="38c520442ab8dc341b03d0615ed829eeef1442a838e4f12e9a497b95fb526f4e" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.872597 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hsjfc"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.873765 4728 scope.go:117] "RemoveContainer" containerID="5c1d9754059a6b2191c6717276e1da8c47384dbe1fa68eb615636a1acff43027" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:47.879039 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-hsjfc"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.177262 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-86cff44659-k2jp2"] Dec 16 15:14:51 crc kubenswrapper[4728]: E1216 15:14:48.177594 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerName="dnsmasq-dns" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.177607 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerName="dnsmasq-dns" Dec 16 15:14:51 crc kubenswrapper[4728]: E1216 15:14:48.177622 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e129cb-0ce5-4289-a50b-2513ab8ba750" containerName="barbican-db-sync" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.177628 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e129cb-0ce5-4289-a50b-2513ab8ba750" containerName="barbican-db-sync" Dec 16 15:14:51 crc kubenswrapper[4728]: E1216 15:14:48.177646 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerName="init" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.177653 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerName="init" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.177813 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e129cb-0ce5-4289-a50b-2513ab8ba750" containerName="barbican-db-sync" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.177829 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" containerName="dnsmasq-dns" Dec 16 15:14:51 crc 
kubenswrapper[4728]: I1216 15:14:48.178673 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.193998 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8qdc9" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.194185 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.194294 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.213679 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-config-data\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.213785 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-logs\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.213840 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmp44\" (UniqueName: \"kubernetes.io/projected/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-kube-api-access-nmp44\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.213867 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-config-data-custom\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.213935 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-combined-ca-bundle\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.214816 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86cff44659-k2jp2"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.228228 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8957f9486-cds65"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.229611 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.241661 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.245165 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8957f9486-cds65"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.274463 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7f56"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.275808 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.293495 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7f56"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315749 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3dd302c-4cb1-487b-9995-a99059ee9ac6-logs\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-config\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315828 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktwf\" (UniqueName: \"kubernetes.io/projected/f3dd302c-4cb1-487b-9995-a99059ee9ac6-kube-api-access-rktwf\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315850 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-config-data\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315865 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315913 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252zz\" (UniqueName: \"kubernetes.io/projected/c659dc6b-019b-4cc8-81c5-2a7732c684c6-kube-api-access-252zz\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315939 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-logs\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315971 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmp44\" (UniqueName: \"kubernetes.io/projected/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-kube-api-access-nmp44\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.315989 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-config-data-custom\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316007 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-config-data\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316032 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316048 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-combined-ca-bundle\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316066 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-config-data-custom\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316084 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316116 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-combined-ca-bundle\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " 
pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.316141 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.317159 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-logs\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.337981 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-combined-ca-bundle\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.338636 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-config-data\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.346024 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmp44\" (UniqueName: \"kubernetes.io/projected/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-kube-api-access-nmp44\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.358473 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e0ec72-0e84-444e-a66f-50b4fe91adb5-config-data-custom\") pod \"barbican-worker-86cff44659-k2jp2\" (UID: \"e3e0ec72-0e84-444e-a66f-50b4fe91adb5\") " pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.390708 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f6d6c99cd-bmn5s"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.392094 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.395794 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.402035 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f6d6c99cd-bmn5s"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418184 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418217 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3dd302c-4cb1-487b-9995-a99059ee9ac6-logs\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418259 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-config\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418277 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktwf\" (UniqueName: \"kubernetes.io/projected/f3dd302c-4cb1-487b-9995-a99059ee9ac6-kube-api-access-rktwf\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418295 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418338 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-252zz\" (UniqueName: \"kubernetes.io/projected/c659dc6b-019b-4cc8-81c5-2a7732c684c6-kube-api-access-252zz\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418436 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-config-data\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418459 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " 
pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418476 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-combined-ca-bundle\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418495 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-config-data-custom\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.418513 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.419185 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3dd302c-4cb1-487b-9995-a99059ee9ac6-logs\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.419276 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.419457 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.419698 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-config\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.419821 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.420322 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 
crc kubenswrapper[4728]: I1216 15:14:48.424313 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-config-data\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.424476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-combined-ca-bundle\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.425788 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f3dd302c-4cb1-487b-9995-a99059ee9ac6-config-data-custom\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.443139 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktwf\" (UniqueName: \"kubernetes.io/projected/f3dd302c-4cb1-487b-9995-a99059ee9ac6-kube-api-access-rktwf\") pod \"barbican-keystone-listener-8957f9486-cds65\" (UID: \"f3dd302c-4cb1-487b-9995-a99059ee9ac6\") " pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.444030 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-252zz\" (UniqueName: \"kubernetes.io/projected/c659dc6b-019b-4cc8-81c5-2a7732c684c6-kube-api-access-252zz\") pod \"dnsmasq-dns-586bdc5f9-s7f56\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.497495 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-86cff44659-k2jp2" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.520549 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h7g\" (UniqueName: \"kubernetes.io/projected/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-kube-api-access-28h7g\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.520599 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data-custom\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.520670 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-combined-ca-bundle\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.520931 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-logs\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.521022 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.552214 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8957f9486-cds65" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.589963 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.622967 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.623078 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28h7g\" (UniqueName: \"kubernetes.io/projected/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-kube-api-access-28h7g\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.623110 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data-custom\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.623186 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-combined-ca-bundle\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.623291 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-logs\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.624687 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-logs\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.635141 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data-custom\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.636034 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.636530 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-combined-ca-bundle\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc 
kubenswrapper[4728]: I1216 15:14:48.644636 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h7g\" (UniqueName: \"kubernetes.io/projected/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-kube-api-access-28h7g\") pod \"barbican-api-6f6d6c99cd-bmn5s\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:48.713716 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:49.523042 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a0c6b8-4f21-45f6-bfd7-e327b00f5399" path="/var/lib/kubelet/pods/51a0c6b8-4f21-45f6-bfd7-e327b00f5399/volumes" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:50.992665 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5","Type":"ContainerStarted","Data":"63e6f29857b4fd8707e7400872ca4f44bfe23c55f9bfb510998e566774bc4495"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.008329 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfxvz" event={"ID":"d8cfd92c-8ec9-4d81-a119-2c35893fba2b","Type":"ContainerStarted","Data":"2785f5649eefb93fed84b8482c968f3ccd82dd418dc4a4324c25b8395214e30a"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.021057 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76fcd78578-bhff6"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.022495 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.025349 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.025545 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.025745 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"b1b468897a2b4285ac91242e60a4e7ba38f4d070d647de3374233d1332ee4a0d"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.028418 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5874cbd465-jjmn6" event={"ID":"18996006-74fc-4090-941f-783741605f54","Type":"ContainerStarted","Data":"ff84e98c862e414f336c2403a582911d7a2a015c1c424f6faa99bd1c7037da57"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.028628 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.030090 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a","Type":"ContainerStarted","Data":"d7f552afb912a99990d8a0b141da623b2f13d2e448cfd154aac5c3a7552ebfe2"} Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.030265 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-log" 
containerID="cri-o://3b8b1fb1e44f24e6a36fdd69c6d5c7be9d8658e29c2c028692fe2b898620a627" gracePeriod=30 Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.030290 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-httpd" containerID="cri-o://d7f552afb912a99990d8a0b141da623b2f13d2e448cfd154aac5c3a7552ebfe2" gracePeriod=30 Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.040181 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76fcd78578-bhff6"] Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.075123 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xfxvz" podStartSLOduration=6.064768468 podStartE2EDuration="47.075104511s" podCreationTimestamp="2025-12-16 15:14:04 +0000 UTC" firstStartedPulling="2025-12-16 15:14:06.05398701 +0000 UTC m=+1026.894165994" lastFinishedPulling="2025-12-16 15:14:47.064323023 +0000 UTC m=+1067.904502037" observedRunningTime="2025-12-16 15:14:51.062807614 +0000 UTC m=+1071.902986598" watchObservedRunningTime="2025-12-16 15:14:51.075104511 +0000 UTC m=+1071.915283495" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.087743 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-combined-ca-bundle\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.087787 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djfhj\" (UniqueName: \"kubernetes.io/projected/4589b3db-cca9-45d9-a576-71188fd26cd1-kube-api-access-djfhj\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.087831 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4589b3db-cca9-45d9-a576-71188fd26cd1-logs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.087860 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-internal-tls-certs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.087887 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-config-data-custom\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.087972 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-public-tls-certs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.088101 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-config-data\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.126940 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5874cbd465-jjmn6" podStartSLOduration=7.126920397 podStartE2EDuration="7.126920397s" podCreationTimestamp="2025-12-16 15:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:51.107916012 +0000 UTC m=+1071.948094996" watchObservedRunningTime="2025-12-16 15:14:51.126920397 +0000 UTC m=+1071.967099381" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.182711 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.182694898 podStartE2EDuration="17.182694898s" podCreationTimestamp="2025-12-16 15:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:51.153601686 +0000 UTC m=+1071.993780670" watchObservedRunningTime="2025-12-16 15:14:51.182694898 +0000 UTC m=+1072.022873882" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189021 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-public-tls-certs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189075 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-config-data\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189137 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-combined-ca-bundle\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189159 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djfhj\" (UniqueName: \"kubernetes.io/projected/4589b3db-cca9-45d9-a576-71188fd26cd1-kube-api-access-djfhj\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189189 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4589b3db-cca9-45d9-a576-71188fd26cd1-logs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189245 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-internal-tls-certs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.189273 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-config-data-custom\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.197655 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-config-data-custom\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.199454 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4589b3db-cca9-45d9-a576-71188fd26cd1-logs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.200163 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-combined-ca-bundle\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.202757 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-internal-tls-certs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.205963 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-public-tls-certs\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.214425 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4589b3db-cca9-45d9-a576-71188fd26cd1-config-data\") pod \"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.215534 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djfhj\" (UniqueName: \"kubernetes.io/projected/4589b3db-cca9-45d9-a576-71188fd26cd1-kube-api-access-djfhj\") pod 
\"barbican-api-76fcd78578-bhff6\" (UID: \"4589b3db-cca9-45d9-a576-71188fd26cd1\") " pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:51 crc kubenswrapper[4728]: I1216 15:14:51.391680 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.062276 4728 generic.go:334] "Generic (PLEG): container finished" podID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerID="d7f552afb912a99990d8a0b141da623b2f13d2e448cfd154aac5c3a7552ebfe2" exitCode=0 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.062890 4728 generic.go:334] "Generic (PLEG): container finished" podID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerID="3b8b1fb1e44f24e6a36fdd69c6d5c7be9d8658e29c2c028692fe2b898620a627" exitCode=143 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.062467 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a","Type":"ContainerDied","Data":"d7f552afb912a99990d8a0b141da623b2f13d2e448cfd154aac5c3a7552ebfe2"} Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.063004 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a","Type":"ContainerDied","Data":"3b8b1fb1e44f24e6a36fdd69c6d5c7be9d8658e29c2c028692fe2b898620a627"} Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.064891 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce","Type":"ContainerStarted","Data":"ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2"} Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.067337 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-log" containerID="cri-o://63e6f29857b4fd8707e7400872ca4f44bfe23c55f9bfb510998e566774bc4495" gracePeriod=30 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.067427 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5","Type":"ContainerStarted","Data":"89a781404f032a6a6d3ec67702ce01a597108ff3940618df25ff857f1a89ff6c"} Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.068121 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-httpd" containerID="cri-o://89a781404f032a6a6d3ec67702ce01a597108ff3940618df25ff857f1a89ff6c" gracePeriod=30 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.137596 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.13757655 podStartE2EDuration="17.13757655s" podCreationTimestamp="2025-12-16 15:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:52.093750856 +0000 UTC m=+1072.933929840" watchObservedRunningTime="2025-12-16 15:14:52.13757655 +0000 UTC m=+1072.977755534" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.153399 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8957f9486-cds65"] Dec 16 15:14:52 
crc kubenswrapper[4728]: I1216 15:14:52.161731 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-86cff44659-k2jp2"] Dec 16 15:14:52 crc kubenswrapper[4728]: W1216 15:14:52.162153 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3dd302c_4cb1_487b_9995_a99059ee9ac6.slice/crio-0e221d6a7871e47c688cd0622aa12b7884b4c5e355dbdf8b0edf7e6da73afea7 WatchSource:0}: Error finding container 0e221d6a7871e47c688cd0622aa12b7884b4c5e355dbdf8b0edf7e6da73afea7: Status 404 returned error can't find the container with id 0e221d6a7871e47c688cd0622aa12b7884b4c5e355dbdf8b0edf7e6da73afea7 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.170012 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f6d6c99cd-bmn5s"] Dec 16 15:14:52 crc kubenswrapper[4728]: W1216 15:14:52.170606 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4602b2dd_bc4e_4d34_8f80_ffb2a267863d.slice/crio-eaa14cb09257a1db7ba2a6d9511e49c71b7abbe206ac7d67d2fc3a16528bb644 WatchSource:0}: Error finding container eaa14cb09257a1db7ba2a6d9511e49c71b7abbe206ac7d67d2fc3a16528bb644: Status 404 returned error can't find the container with id eaa14cb09257a1db7ba2a6d9511e49c71b7abbe206ac7d67d2fc3a16528bb644 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.182634 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76fcd78578-bhff6"] Dec 16 15:14:52 crc kubenswrapper[4728]: W1216 15:14:52.183682 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4589b3db_cca9_45d9_a576_71188fd26cd1.slice/crio-6bfefd3ca8e67b3ccaf692f5d92336bf3cd9abb493114d5baedb5dcf4e27f808 WatchSource:0}: Error finding container 6bfefd3ca8e67b3ccaf692f5d92336bf3cd9abb493114d5baedb5dcf4e27f808: Status 404 returned error can't find the container with id 6bfefd3ca8e67b3ccaf692f5d92336bf3cd9abb493114d5baedb5dcf4e27f808 Dec 16 15:14:52 crc kubenswrapper[4728]: W1216 15:14:52.191480 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc659dc6b_019b_4cc8_81c5_2a7732c684c6.slice/crio-1067232c91febf7b86f917d6f8ff86b842cfd6e5f45a0a5eb4ffbce73245e8f1 WatchSource:0}: Error finding container 1067232c91febf7b86f917d6f8ff86b842cfd6e5f45a0a5eb4ffbce73245e8f1: Status 404 returned error can't find the container with id 1067232c91febf7b86f917d6f8ff86b842cfd6e5f45a0a5eb4ffbce73245e8f1 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.192496 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7f56"] Dec 16 15:14:52 crc kubenswrapper[4728]: W1216 15:14:52.234062 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3e0ec72_0e84_444e_a66f_50b4fe91adb5.slice/crio-58c026e69abf9e21e4c5662a5f50bf0f37a0de2366c90f27269974f5e4f31e15 WatchSource:0}: Error finding container 58c026e69abf9e21e4c5662a5f50bf0f37a0de2366c90f27269974f5e4f31e15: Status 404 returned error can't find the container with id 58c026e69abf9e21e4c5662a5f50bf0f37a0de2366c90f27269974f5e4f31e15 Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.524333 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.734967 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.735453 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-config-data\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.735532 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7lt\" (UniqueName: \"kubernetes.io/projected/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-kube-api-access-7b7lt\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.735559 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-scripts\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.735583 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-logs\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.735661 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-httpd-run\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.735692 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-combined-ca-bundle\") pod \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\" (UID: \"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a\") " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.754729 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-logs" (OuterVolumeSpecName: "logs") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.754982 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.765839 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.770578 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-scripts" (OuterVolumeSpecName: "scripts") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.776684 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-kube-api-access-7b7lt" (OuterVolumeSpecName: "kube-api-access-7b7lt") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "kube-api-access-7b7lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.789662 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.845450 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.845486 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7lt\" (UniqueName: \"kubernetes.io/projected/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-kube-api-access-7b7lt\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.845497 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.845505 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.845513 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.845520 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.918896 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.947728 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:52 crc kubenswrapper[4728]: I1216 15:14:52.973543 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-config-data" (OuterVolumeSpecName: "config-data") pod "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" (UID: "40f3cadf-05fb-4aeb-94dc-47c0cfbd535a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.050458 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.089114 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86cff44659-k2jp2" event={"ID":"e3e0ec72-0e84-444e-a66f-50b4fe91adb5","Type":"ContainerStarted","Data":"58c026e69abf9e21e4c5662a5f50bf0f37a0de2366c90f27269974f5e4f31e15"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.096837 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.098507 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"40f3cadf-05fb-4aeb-94dc-47c0cfbd535a","Type":"ContainerDied","Data":"47c75d9908c82f3a414194764bc21d56da50b18a0d0bc17ba9c5dafbc667950e"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.098581 4728 scope.go:117] "RemoveContainer" containerID="d7f552afb912a99990d8a0b141da623b2f13d2e448cfd154aac5c3a7552ebfe2" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.104785 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" event={"ID":"4602b2dd-bc4e-4d34-8f80-ffb2a267863d","Type":"ContainerStarted","Data":"aab515c1cadcd953b088a0fe2f17fd3d17753993e7b478607d37ea563fdccc0d"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.104852 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" event={"ID":"4602b2dd-bc4e-4d34-8f80-ffb2a267863d","Type":"ContainerStarted","Data":"eaa14cb09257a1db7ba2a6d9511e49c71b7abbe206ac7d67d2fc3a16528bb644"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.106875 4728 generic.go:334] "Generic (PLEG): container finished" podID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerID="acd31ec7169929cc4145fca96653ce713cf810c753c49c9fe28898aba436f16f" exitCode=0 Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.106941 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" event={"ID":"c659dc6b-019b-4cc8-81c5-2a7732c684c6","Type":"ContainerDied","Data":"acd31ec7169929cc4145fca96653ce713cf810c753c49c9fe28898aba436f16f"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.106965 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" event={"ID":"c659dc6b-019b-4cc8-81c5-2a7732c684c6","Type":"ContainerStarted","Data":"1067232c91febf7b86f917d6f8ff86b842cfd6e5f45a0a5eb4ffbce73245e8f1"} Dec 16 15:14:53 crc kubenswrapper[4728]: 
I1216 15:14:53.117292 4728 generic.go:334] "Generic (PLEG): container finished" podID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerID="89a781404f032a6a6d3ec67702ce01a597108ff3940618df25ff857f1a89ff6c" exitCode=0 Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.117331 4728 generic.go:334] "Generic (PLEG): container finished" podID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerID="63e6f29857b4fd8707e7400872ca4f44bfe23c55f9bfb510998e566774bc4495" exitCode=143 Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.117384 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5","Type":"ContainerDied","Data":"89a781404f032a6a6d3ec67702ce01a597108ff3940618df25ff857f1a89ff6c"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.117433 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5","Type":"ContainerDied","Data":"63e6f29857b4fd8707e7400872ca4f44bfe23c55f9bfb510998e566774bc4495"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.118848 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8957f9486-cds65" event={"ID":"f3dd302c-4cb1-487b-9995-a99059ee9ac6","Type":"ContainerStarted","Data":"0e221d6a7871e47c688cd0622aa12b7884b4c5e355dbdf8b0edf7e6da73afea7"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.125737 4728 scope.go:117] "RemoveContainer" containerID="3b8b1fb1e44f24e6a36fdd69c6d5c7be9d8658e29c2c028692fe2b898620a627" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.131724 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76fcd78578-bhff6" event={"ID":"4589b3db-cca9-45d9-a576-71188fd26cd1","Type":"ContainerStarted","Data":"d7be169cbddcacc4177283e6e9d3b9a863c97dbf8d8a8b479556422a2eee44a5"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.131762 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76fcd78578-bhff6" event={"ID":"4589b3db-cca9-45d9-a576-71188fd26cd1","Type":"ContainerStarted","Data":"6bfefd3ca8e67b3ccaf692f5d92336bf3cd9abb493114d5baedb5dcf4e27f808"} Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.154183 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.186794 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.196878 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:53 crc kubenswrapper[4728]: E1216 15:14:53.197301 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-httpd" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.197318 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-httpd" Dec 16 15:14:53 crc kubenswrapper[4728]: E1216 15:14:53.197343 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-log" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.197350 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-log" Dec 16 15:14:53 crc kubenswrapper[4728]: 
I1216 15:14:53.197541 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-log" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.197561 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" containerName="glance-httpd" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.198458 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.207506 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.208322 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.209260 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.297637 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360347 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-config-data\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360391 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-scripts\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360482 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360514 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360579 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360647 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f2l9\" (UniqueName: \"kubernetes.io/projected/955f80b9-933a-4583-92b7-f11c5ccd1bec-kube-api-access-7f2l9\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.360691 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-logs\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.461995 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-combined-ca-bundle\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.462663 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-logs" (OuterVolumeSpecName: "logs") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463114 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-logs\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463193 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-scripts\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463256 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-httpd-run\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463342 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-config-data\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463421 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463453 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nk49d\" (UniqueName: \"kubernetes.io/projected/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-kube-api-access-nk49d\") pod \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\" (UID: \"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5\") " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.463952 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464076 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464169 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f2l9\" (UniqueName: \"kubernetes.io/projected/955f80b9-933a-4583-92b7-f11c5ccd1bec-kube-api-access-7f2l9\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-logs\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464334 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-config-data\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464360 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-scripts\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464389 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464447 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464509 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 
crc kubenswrapper[4728]: I1216 15:14:53.464872 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.464237 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.470003 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.475542 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-kube-api-access-nk49d" (OuterVolumeSpecName: "kube-api-access-nk49d") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "kube-api-access-nk49d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.477432 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.479224 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.479922 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.480227 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-logs\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.480900 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-scripts\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.483521 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-scripts" (OuterVolumeSpecName: "scripts") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.486706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f2l9\" (UniqueName: \"kubernetes.io/projected/955f80b9-933a-4583-92b7-f11c5ccd1bec-kube-api-access-7f2l9\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.495504 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-config-data\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.502320 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.514807 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.520164 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.528866 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f3cadf-05fb-4aeb-94dc-47c0cfbd535a" path="/var/lib/kubelet/pods/40f3cadf-05fb-4aeb-94dc-47c0cfbd535a/volumes" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.536578 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-589dd4bc84-6zndr" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.566067 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.566102 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk49d\" (UniqueName: \"kubernetes.io/projected/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-kube-api-access-nk49d\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.566113 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.566125 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.566133 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.624144 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.642544 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7585b44dcb-46w99" podUID="ac195fba-37cf-48a1-aa91-c9df824ddfe4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.651525 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-config-data" (OuterVolumeSpecName: "config-data") pod "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" (UID: "4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.667731 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:53 crc kubenswrapper[4728]: I1216 15:14:53.667781 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.133771 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:14:54 crc kubenswrapper[4728]: W1216 15:14:54.144918 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod955f80b9_933a_4583_92b7_f11c5ccd1bec.slice/crio-94d2d6edd402cf7b05d9f3b0d4ccab70bf6318c6fde6dba9d27cd00b55e0f405 WatchSource:0}: Error finding container 94d2d6edd402cf7b05d9f3b0d4ccab70bf6318c6fde6dba9d27cd00b55e0f405: Status 404 returned error can't find the container with id 94d2d6edd402cf7b05d9f3b0d4ccab70bf6318c6fde6dba9d27cd00b55e0f405 Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.148768 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" event={"ID":"4602b2dd-bc4e-4d34-8f80-ffb2a267863d","Type":"ContainerStarted","Data":"758378769e7341311cffbb157b7c1df2ffc8f4ac6d2f8c318e4c75a40a5fa0dd"} Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.148907 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.148938 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.153153 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" event={"ID":"c659dc6b-019b-4cc8-81c5-2a7732c684c6","Type":"ContainerStarted","Data":"0574e902672c8b10da2dda38fe7bb1222ec789e485bd15ed54f7030f6919a931"} Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.153399 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.155491 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5","Type":"ContainerDied","Data":"1e67330cfd152eafb981a7539dac5805040814a2914bfed3831f4a4140e84ae1"} Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.155538 4728 scope.go:117] "RemoveContainer" containerID="89a781404f032a6a6d3ec67702ce01a597108ff3940618df25ff857f1a89ff6c" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.155676 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.157625 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76fcd78578-bhff6" event={"ID":"4589b3db-cca9-45d9-a576-71188fd26cd1","Type":"ContainerStarted","Data":"11ad5e9af10de490055baf379baf3895a785dbacc049e1120d16f8af1e4736f6"} Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.158225 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.158274 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.176597 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podStartSLOduration=6.176580887 podStartE2EDuration="6.176580887s" podCreationTimestamp="2025-12-16 15:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:54.167686351 +0000 UTC m=+1075.007865345" watchObservedRunningTime="2025-12-16 15:14:54.176580887 +0000 UTC m=+1075.016759871" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.196810 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76fcd78578-bhff6" podStartSLOduration=4.196791213 podStartE2EDuration="4.196791213s" podCreationTimestamp="2025-12-16 15:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:54.188650437 +0000 UTC m=+1075.028829451" watchObservedRunningTime="2025-12-16 15:14:54.196791213 +0000 UTC m=+1075.036970197" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.222928 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" podStartSLOduration=6.222908957 podStartE2EDuration="6.222908957s" podCreationTimestamp="2025-12-16 15:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:54.208876804 +0000 UTC m=+1075.049055798" watchObservedRunningTime="2025-12-16 15:14:54.222908957 +0000 UTC m=+1075.063087941" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.234438 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.244681 4728 scope.go:117] "RemoveContainer" containerID="63e6f29857b4fd8707e7400872ca4f44bfe23c55f9bfb510998e566774bc4495" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.247071 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.257479 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:54 crc kubenswrapper[4728]: E1216 15:14:54.257967 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-httpd" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.257987 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-httpd" Dec 16 15:14:54 crc 
kubenswrapper[4728]: E1216 15:14:54.258026 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-log" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.258034 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-log" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.258271 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-log" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.258308 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" containerName="glance-httpd" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.259515 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.264782 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.264887 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.303705 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391429 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-logs\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391488 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6xc\" (UniqueName: \"kubernetes.io/projected/916a6b2e-6b7b-457e-b2a2-80d02edc2217-kube-api-access-vp6xc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391521 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391566 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391631 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-scripts\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391688 4728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391739 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.391767 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-config-data\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.492910 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.492973 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.492995 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-config-data\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.493027 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-logs\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.493050 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6xc\" (UniqueName: \"kubernetes.io/projected/916a6b2e-6b7b-457e-b2a2-80d02edc2217-kube-api-access-vp6xc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.493069 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.493099 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.493148 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-scripts\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.493751 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.536046 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-logs\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.544716 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.551189 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.551750 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6xc\" (UniqueName: \"kubernetes.io/projected/916a6b2e-6b7b-457e-b2a2-80d02edc2217-kube-api-access-vp6xc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.552598 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.558437 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-config-data\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.569853 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-scripts\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.579749 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:14:54 crc kubenswrapper[4728]: I1216 15:14:54.840639 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:14:55 crc kubenswrapper[4728]: I1216 15:14:55.167491 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"955f80b9-933a-4583-92b7-f11c5ccd1bec","Type":"ContainerStarted","Data":"54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c"} Dec 16 15:14:55 crc kubenswrapper[4728]: I1216 15:14:55.167835 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"955f80b9-933a-4583-92b7-f11c5ccd1bec","Type":"ContainerStarted","Data":"94d2d6edd402cf7b05d9f3b0d4ccab70bf6318c6fde6dba9d27cd00b55e0f405"} Dec 16 15:14:55 crc kubenswrapper[4728]: I1216 15:14:55.446798 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:14:55 crc kubenswrapper[4728]: I1216 15:14:55.520064 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5" path="/var/lib/kubelet/pods/4b901ad7-71ef-4891-a81d-ab9fd6e0eaa5/volumes" Dec 16 15:14:56 crc kubenswrapper[4728]: I1216 15:14:56.187440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"955f80b9-933a-4583-92b7-f11c5ccd1bec","Type":"ContainerStarted","Data":"71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0"} Dec 16 15:14:56 crc kubenswrapper[4728]: I1216 15:14:56.189464 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"916a6b2e-6b7b-457e-b2a2-80d02edc2217","Type":"ContainerStarted","Data":"c2f27f86e8a584c665cd7d6538155dc1abeea1e78c5d29a69d32d58406d404cd"} Dec 16 15:14:56 crc kubenswrapper[4728]: I1216 15:14:56.189506 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"916a6b2e-6b7b-457e-b2a2-80d02edc2217","Type":"ContainerStarted","Data":"a82f5c2564c9cc5f92f2e539291846ae73ba3fc8979705dbe5d7044490e83272"} Dec 16 15:14:57 crc kubenswrapper[4728]: I1216 15:14:57.225064 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.225049025 podStartE2EDuration="4.225049025s" podCreationTimestamp="2025-12-16 15:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:14:57.221326846 +0000 UTC m=+1078.061505830" watchObservedRunningTime="2025-12-16 15:14:57.225049025 +0000 UTC m=+1078.065228009" Dec 16 15:14:58 crc kubenswrapper[4728]: I1216 15:14:58.592662 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:14:58 crc kubenswrapper[4728]: I1216 
15:14:58.663764 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-qpqs4"] Dec 16 15:14:58 crc kubenswrapper[4728]: I1216 15:14:58.664028 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" containerID="cri-o://d92266dda6f758648ff3612d11455320d2af4da2f9450f82688af7954645e282" gracePeriod=10 Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.156090 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc"] Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.157378 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.160176 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.164881 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.171812 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc"] Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.289482 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.309169 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6lc\" (UniqueName: \"kubernetes.io/projected/b159943c-5acb-49ba-951d-fb64f30525d2-kube-api-access-5j6lc\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.309227 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b159943c-5acb-49ba-951d-fb64f30525d2-secret-volume\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.309616 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b159943c-5acb-49ba-951d-fb64f30525d2-config-volume\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.411597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b159943c-5acb-49ba-951d-fb64f30525d2-config-volume\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" 
Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.411789 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6lc\" (UniqueName: \"kubernetes.io/projected/b159943c-5acb-49ba-951d-fb64f30525d2-kube-api-access-5j6lc\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.411841 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b159943c-5acb-49ba-951d-fb64f30525d2-secret-volume\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.413022 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b159943c-5acb-49ba-951d-fb64f30525d2-config-volume\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.427842 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b159943c-5acb-49ba-951d-fb64f30525d2-secret-volume\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.430183 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6lc\" (UniqueName: \"kubernetes.io/projected/b159943c-5acb-49ba-951d-fb64f30525d2-kube-api-access-5j6lc\") pod \"collect-profiles-29431635-9jtzc\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:00 crc kubenswrapper[4728]: I1216 15:15:00.479079 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:02 crc kubenswrapper[4728]: I1216 15:15:02.403600 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-76fcd78578-bhff6" podUID="4589b3db-cca9-45d9-a576-71188fd26cd1" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.154:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:15:02 crc kubenswrapper[4728]: I1216 15:15:02.463545 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-76fcd78578-bhff6" podUID="4589b3db-cca9-45d9-a576-71188fd26cd1" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 15:15:02 crc kubenswrapper[4728]: I1216 15:15:02.464914 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 15:15:02 crc kubenswrapper[4728]: I1216 15:15:02.468714 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 15:15:02 crc kubenswrapper[4728]: I1216 15:15:02.478143 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 16 15:15:02 crc kubenswrapper[4728]: I1216 15:15:02.486841 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.243766 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.255026 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76fcd78578-bhff6" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.265911 4728 generic.go:334] "Generic (PLEG): container finished" podID="e1130519-ad80-4590-a993-f7ebaf324408" containerID="d92266dda6f758648ff3612d11455320d2af4da2f9450f82688af7954645e282" exitCode=0 Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.265993 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" event={"ID":"e1130519-ad80-4590-a993-f7ebaf324408","Type":"ContainerDied","Data":"d92266dda6f758648ff3612d11455320d2af4da2f9450f82688af7954645e282"} Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.340897 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f6d6c99cd-bmn5s"] Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.344147 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" containerID="cri-o://aab515c1cadcd953b088a0fe2f17fd3d17753993e7b478607d37ea563fdccc0d" gracePeriod=30 Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.344620 4728 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" containerID="cri-o://758378769e7341311cffbb157b7c1df2ffc8f4ac6d2f8c318e4c75a40a5fa0dd" gracePeriod=30 Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.364604 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": EOF" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.364605 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": EOF" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.521660 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.522772 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.564785 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 15:15:03 crc kubenswrapper[4728]: I1216 15:15:03.572797 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 15:15:04 crc kubenswrapper[4728]: I1216 15:15:04.273383 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:15:04 crc kubenswrapper[4728]: I1216 15:15:04.273617 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:15:05 crc kubenswrapper[4728]: I1216 15:15:05.286711 4728 generic.go:334] "Generic (PLEG): container finished" podID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerID="aab515c1cadcd953b088a0fe2f17fd3d17753993e7b478607d37ea563fdccc0d" exitCode=143 Dec 16 15:15:05 crc kubenswrapper[4728]: I1216 15:15:05.286930 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" event={"ID":"4602b2dd-bc4e-4d34-8f80-ffb2a267863d","Type":"ContainerDied","Data":"aab515c1cadcd953b088a0fe2f17fd3d17753993e7b478607d37ea563fdccc0d"} Dec 16 15:15:05 crc kubenswrapper[4728]: I1216 15:15:05.289277 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Dec 16 15:15:05 crc kubenswrapper[4728]: I1216 15:15:05.566224 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:15:05 crc kubenswrapper[4728]: I1216 15:15:05.841645 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:15:06 crc kubenswrapper[4728]: I1216 15:15:06.588689 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 15:15:06 crc kubenswrapper[4728]: I1216 15:15:06.588805 4728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 
15:15:06 crc kubenswrapper[4728]: I1216 15:15:06.592720 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 15:15:07 crc kubenswrapper[4728]: I1216 15:15:07.518124 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:15:07 crc kubenswrapper[4728]: I1216 15:15:07.702536 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7585b44dcb-46w99" Dec 16 15:15:07 crc kubenswrapper[4728]: I1216 15:15:07.771981 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-589dd4bc84-6zndr"] Dec 16 15:15:07 crc kubenswrapper[4728]: I1216 15:15:07.794938 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:34054->10.217.0.153:9311: read: connection reset by peer" Dec 16 15:15:07 crc kubenswrapper[4728]: I1216 15:15:07.794991 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:34042->10.217.0.153:9311: read: connection reset by peer" Dec 16 15:15:07 crc kubenswrapper[4728]: I1216 15:15:07.795773 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.311345 4728 generic.go:334] "Generic (PLEG): container finished" podID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerID="758378769e7341311cffbb157b7c1df2ffc8f4ac6d2f8c318e4c75a40a5fa0dd" exitCode=0 Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.311443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" event={"ID":"4602b2dd-bc4e-4d34-8f80-ffb2a267863d","Type":"ContainerDied","Data":"758378769e7341311cffbb157b7c1df2ffc8f4ac6d2f8c318e4c75a40a5fa0dd"} Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.317429 4728 generic.go:334] "Generic (PLEG): container finished" podID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" containerID="2785f5649eefb93fed84b8482c968f3ccd82dd418dc4a4324c25b8395214e30a" exitCode=0 Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.317502 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfxvz" event={"ID":"d8cfd92c-8ec9-4d81-a119-2c35893fba2b","Type":"ContainerDied","Data":"2785f5649eefb93fed84b8482c968f3ccd82dd418dc4a4324c25b8395214e30a"} Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.317691 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-589dd4bc84-6zndr" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon-log" containerID="cri-o://ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7" gracePeriod=30 Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.317756 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-589dd4bc84-6zndr" 
podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" containerID="cri-o://416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770" gracePeriod=30 Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.724014 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Dec 16 15:15:08 crc kubenswrapper[4728]: I1216 15:15:08.724014 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Dec 16 15:15:10 crc kubenswrapper[4728]: I1216 15:15:10.290203 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Dec 16 15:15:10 crc kubenswrapper[4728]: I1216 15:15:10.290644 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:15:10 crc kubenswrapper[4728]: I1216 15:15:10.334375 4728 generic.go:334] "Generic (PLEG): container finished" podID="04fe707b-a597-4768-8190-6efb7aea9faa" containerID="330bb68c46623995b5939a30a6e76c4843fed254b676083b2d151a5ad3c2433e" exitCode=0 Dec 16 15:15:10 crc kubenswrapper[4728]: I1216 15:15:10.334434 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wcv69" event={"ID":"04fe707b-a597-4768-8190-6efb7aea9faa","Type":"ContainerDied","Data":"330bb68c46623995b5939a30a6e76c4843fed254b676083b2d151a5ad3c2433e"} Dec 16 15:15:12 crc kubenswrapper[4728]: I1216 15:15:12.350390 4728 generic.go:334] "Generic (PLEG): container finished" podID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerID="416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770" exitCode=0 Dec 16 15:15:12 crc kubenswrapper[4728]: I1216 15:15:12.350442 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589dd4bc84-6zndr" event={"ID":"f33646e3-23f5-40a1-88ef-f55bdd5a230c","Type":"ContainerDied","Data":"416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770"} Dec 16 15:15:13 crc kubenswrapper[4728]: I1216 15:15:13.525343 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-589dd4bc84-6zndr" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 16 15:15:13 crc kubenswrapper[4728]: E1216 15:15:13.658111 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 16 15:15:13 crc kubenswrapper[4728]: E1216 15:15:13.658320 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9h2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 16 15:15:13 crc kubenswrapper[4728]: E1216 15:15:13.659627 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" Dec 16 15:15:13 crc kubenswrapper[4728]: I1216 15:15:13.714243 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" 
probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Dec 16 15:15:13 crc kubenswrapper[4728]: I1216 15:15:13.714359 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:15:13 crc kubenswrapper[4728]: I1216 15:15:13.714560 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.367136 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="ceilometer-notification-agent" containerID="cri-o://258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131" gracePeriod=30 Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.367206 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="sg-core" containerID="cri-o://ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2" gracePeriod=30 Dec 16 15:15:14 crc kubenswrapper[4728]: E1216 15:15:14.632592 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 16 15:15:14 crc kubenswrapper[4728]: E1216 15:15:14.632780 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mzdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9d9zb_openstack(f82109b1-c2b6-462c-8857-d0d8b243f64a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:15:14 crc kubenswrapper[4728]: E1216 15:15:14.634169 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9d9zb" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.809898 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xfxvz" Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.827018 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wcv69" Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.860563 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983595 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-swift-storage-0\") pod \"e1130519-ad80-4590-a993-f7ebaf324408\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983663 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-config\") pod \"e1130519-ad80-4590-a993-f7ebaf324408\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983745 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccc6l\" (UniqueName: \"kubernetes.io/projected/04fe707b-a597-4768-8190-6efb7aea9faa-kube-api-access-ccc6l\") pod \"04fe707b-a597-4768-8190-6efb7aea9faa\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983759 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-svc\") pod \"e1130519-ad80-4590-a993-f7ebaf324408\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983795 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-logs\") pod \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983821 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-combined-ca-bundle\") pod \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983837 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-sb\") pod \"e1130519-ad80-4590-a993-f7ebaf324408\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983870 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmz7c\" (UniqueName: \"kubernetes.io/projected/e1130519-ad80-4590-a993-f7ebaf324408-kube-api-access-gmz7c\") pod \"e1130519-ad80-4590-a993-f7ebaf324408\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983894 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-combined-ca-bundle\") pod \"04fe707b-a597-4768-8190-6efb7aea9faa\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983911 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-scripts\") pod \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\" (UID: 
\"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983932 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-nb\") pod \"e1130519-ad80-4590-a993-f7ebaf324408\" (UID: \"e1130519-ad80-4590-a993-f7ebaf324408\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983955 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr7nn\" (UniqueName: \"kubernetes.io/projected/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-kube-api-access-hr7nn\") pod \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.983981 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-config-data\") pod \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\" (UID: \"d8cfd92c-8ec9-4d81-a119-2c35893fba2b\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.984032 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-config\") pod \"04fe707b-a597-4768-8190-6efb7aea9faa\" (UID: \"04fe707b-a597-4768-8190-6efb7aea9faa\") " Dec 16 15:15:14 crc kubenswrapper[4728]: I1216 15:15:14.992211 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-logs" (OuterVolumeSpecName: "logs") pod "d8cfd92c-8ec9-4d81-a119-2c35893fba2b" (UID: "d8cfd92c-8ec9-4d81-a119-2c35893fba2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.010965 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-kube-api-access-hr7nn" (OuterVolumeSpecName: "kube-api-access-hr7nn") pod "d8cfd92c-8ec9-4d81-a119-2c35893fba2b" (UID: "d8cfd92c-8ec9-4d81-a119-2c35893fba2b"). InnerVolumeSpecName "kube-api-access-hr7nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.011259 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fe707b-a597-4768-8190-6efb7aea9faa-kube-api-access-ccc6l" (OuterVolumeSpecName: "kube-api-access-ccc6l") pod "04fe707b-a597-4768-8190-6efb7aea9faa" (UID: "04fe707b-a597-4768-8190-6efb7aea9faa"). InnerVolumeSpecName "kube-api-access-ccc6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.013717 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1130519-ad80-4590-a993-f7ebaf324408-kube-api-access-gmz7c" (OuterVolumeSpecName: "kube-api-access-gmz7c") pod "e1130519-ad80-4590-a993-f7ebaf324408" (UID: "e1130519-ad80-4590-a993-f7ebaf324408"). InnerVolumeSpecName "kube-api-access-gmz7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.015563 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-scripts" (OuterVolumeSpecName: "scripts") pod "d8cfd92c-8ec9-4d81-a119-2c35893fba2b" (UID: "d8cfd92c-8ec9-4d81-a119-2c35893fba2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.022184 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.086215 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmz7c\" (UniqueName: \"kubernetes.io/projected/e1130519-ad80-4590-a993-f7ebaf324408-kube-api-access-gmz7c\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.086275 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.086288 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr7nn\" (UniqueName: \"kubernetes.io/projected/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-kube-api-access-hr7nn\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.086303 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccc6l\" (UniqueName: \"kubernetes.io/projected/04fe707b-a597-4768-8190-6efb7aea9faa-kube-api-access-ccc6l\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.086315 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.184296 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8cfd92c-8ec9-4d81-a119-2c35893fba2b" (UID: "d8cfd92c-8ec9-4d81-a119-2c35893fba2b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.187125 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-combined-ca-bundle\") pod \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.187192 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-logs\") pod \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.187273 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data-custom\") pod \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.187340 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data\") pod \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.187366 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28h7g\" (UniqueName: \"kubernetes.io/projected/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-kube-api-access-28h7g\") pod \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\" (UID: \"4602b2dd-bc4e-4d34-8f80-ffb2a267863d\") " Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.189515 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.192285 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-logs" (OuterVolumeSpecName: "logs") pod "4602b2dd-bc4e-4d34-8f80-ffb2a267863d" (UID: "4602b2dd-bc4e-4d34-8f80-ffb2a267863d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.213282 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc"] Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.227096 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-kube-api-access-28h7g" (OuterVolumeSpecName: "kube-api-access-28h7g") pod "4602b2dd-bc4e-4d34-8f80-ffb2a267863d" (UID: "4602b2dd-bc4e-4d34-8f80-ffb2a267863d"). InnerVolumeSpecName "kube-api-access-28h7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.227101 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4602b2dd-bc4e-4d34-8f80-ffb2a267863d" (UID: "4602b2dd-bc4e-4d34-8f80-ffb2a267863d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.282251 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-config-data" (OuterVolumeSpecName: "config-data") pod "d8cfd92c-8ec9-4d81-a119-2c35893fba2b" (UID: "d8cfd92c-8ec9-4d81-a119-2c35893fba2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.292370 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8cfd92c-8ec9-4d81-a119-2c35893fba2b-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.292453 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.292466 4728 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.292479 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28h7g\" (UniqueName: \"kubernetes.io/projected/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-kube-api-access-28h7g\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.292746 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-config" (OuterVolumeSpecName: "config") pod "04fe707b-a597-4768-8190-6efb7aea9faa" (UID: "04fe707b-a597-4768-8190-6efb7aea9faa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.305543 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04fe707b-a597-4768-8190-6efb7aea9faa" (UID: "04fe707b-a597-4768-8190-6efb7aea9faa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.314511 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1130519-ad80-4590-a993-f7ebaf324408" (UID: "e1130519-ad80-4590-a993-f7ebaf324408"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.329962 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4602b2dd-bc4e-4d34-8f80-ffb2a267863d" (UID: "4602b2dd-bc4e-4d34-8f80-ffb2a267863d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.330644 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1130519-ad80-4590-a993-f7ebaf324408" (UID: "e1130519-ad80-4590-a993-f7ebaf324408"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.330758 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data" (OuterVolumeSpecName: "config-data") pod "4602b2dd-bc4e-4d34-8f80-ffb2a267863d" (UID: "4602b2dd-bc4e-4d34-8f80-ffb2a267863d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.331445 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1130519-ad80-4590-a993-f7ebaf324408" (UID: "e1130519-ad80-4590-a993-f7ebaf324408"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.331657 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-config" (OuterVolumeSpecName: "config") pod "e1130519-ad80-4590-a993-f7ebaf324408" (UID: "e1130519-ad80-4590-a993-f7ebaf324408"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.332556 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1130519-ad80-4590-a993-f7ebaf324408" (UID: "e1130519-ad80-4590-a993-f7ebaf324408"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.376353 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" event={"ID":"4602b2dd-bc4e-4d34-8f80-ffb2a267863d","Type":"ContainerDied","Data":"eaa14cb09257a1db7ba2a6d9511e49c71b7abbe206ac7d67d2fc3a16528bb644"} Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.377579 4728 scope.go:117] "RemoveContainer" containerID="758378769e7341311cffbb157b7c1df2ffc8f4ac6d2f8c318e4c75a40a5fa0dd" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.376740 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f6d6c99cd-bmn5s" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.381230 4728 generic.go:334] "Generic (PLEG): container finished" podID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerID="ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2" exitCode=2 Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.381357 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce","Type":"ContainerDied","Data":"ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2"} Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.383815 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wcv69" event={"ID":"04fe707b-a597-4768-8190-6efb7aea9faa","Type":"ContainerDied","Data":"2c86b28d66820af505511bc5e5f98be31d25d21a265ad6e6a9cfe2fa63533efa"} Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.383953 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c86b28d66820af505511bc5e5f98be31d25d21a265ad6e6a9cfe2fa63533efa" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.383856 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wcv69" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.390181 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" event={"ID":"b159943c-5acb-49ba-951d-fb64f30525d2","Type":"ContainerStarted","Data":"b3817a126bb21f2cda6e98c06e6c6c18381bec0a642787fbadd5b8159ae727ef"} Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.393671 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.393828 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.393890 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.393952 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.394034 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/04fe707b-a597-4768-8190-6efb7aea9faa-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.394089 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.394142 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc 
kubenswrapper[4728]: I1216 15:15:15.394194 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4602b2dd-bc4e-4d34-8f80-ffb2a267863d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.394253 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1130519-ad80-4590-a993-f7ebaf324408-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.397102 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.397690 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-qpqs4" event={"ID":"e1130519-ad80-4590-a993-f7ebaf324408","Type":"ContainerDied","Data":"4eccd9270b61e7dd8c6888d5d46b077c7ec4a35658a4696f56543dc2cb1b6ecc"} Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.399037 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xfxvz" event={"ID":"d8cfd92c-8ec9-4d81-a119-2c35893fba2b","Type":"ContainerDied","Data":"84cba1f7ac61b4c0c1fddb66e5cfb96cd040347a1f4a5e2e2d38d346dc6f3c0c"} Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.399129 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84cba1f7ac61b4c0c1fddb66e5cfb96cd040347a1f4a5e2e2d38d346dc6f3c0c" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.399244 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xfxvz" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.405051 4728 scope.go:117] "RemoveContainer" containerID="aab515c1cadcd953b088a0fe2f17fd3d17753993e7b478607d37ea563fdccc0d" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.439134 4728 scope.go:117] "RemoveContainer" containerID="d92266dda6f758648ff3612d11455320d2af4da2f9450f82688af7954645e282" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.457178 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f6d6c99cd-bmn5s"] Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.460359 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f6d6c99cd-bmn5s"] Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.461736 4728 scope.go:117] "RemoveContainer" containerID="30c310f67b6d4ea116cae808b3be9bfec1302e1b6d13bc6758631e60d64e558b" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.466598 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-qpqs4"] Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.473067 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-qpqs4"] Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.518748 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" path="/var/lib/kubelet/pods/4602b2dd-bc4e-4d34-8f80-ffb2a267863d/volumes" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.519676 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1130519-ad80-4590-a993-f7ebaf324408" path="/var/lib/kubelet/pods/e1130519-ad80-4590-a993-f7ebaf324408/volumes" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.953822 4728 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-7dcd7544cd-gnxgg"] Dec 16 15:15:15 crc kubenswrapper[4728]: E1216 15:15:15.956122 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fe707b-a597-4768-8190-6efb7aea9faa" containerName="neutron-db-sync" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.956256 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fe707b-a597-4768-8190-6efb7aea9faa" containerName="neutron-db-sync" Dec 16 15:15:15 crc kubenswrapper[4728]: E1216 15:15:15.956361 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" containerName="placement-db-sync" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.956524 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" containerName="placement-db-sync" Dec 16 15:15:15 crc kubenswrapper[4728]: E1216 15:15:15.956666 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.956765 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" Dec 16 15:15:15 crc kubenswrapper[4728]: E1216 15:15:15.956870 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.957027 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" Dec 16 15:15:15 crc kubenswrapper[4728]: E1216 15:15:15.957161 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.957266 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" Dec 16 15:15:15 crc kubenswrapper[4728]: E1216 15:15:15.957374 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="init" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.957535 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="init" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.957915 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.958043 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1130519-ad80-4590-a993-f7ebaf324408" containerName="dnsmasq-dns" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.958173 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fe707b-a597-4768-8190-6efb7aea9faa" containerName="neutron-db-sync" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.958297 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" containerName="placement-db-sync" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.958450 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="4602b2dd-bc4e-4d34-8f80-ffb2a267863d" containerName="barbican-api-log" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.959934 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.962253 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.963920 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7dcd7544cd-gnxgg"] Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.965152 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.965158 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.965267 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 16 15:15:15 crc kubenswrapper[4728]: I1216 15:15:15.965316 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hrsxm" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.081507 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tv7cs"] Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.085958 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.109020 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tv7cs"] Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119301 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-config-data\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119350 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac43e45-8d37-4ab4-9ebe-441421fe9044-logs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119423 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xbz\" (UniqueName: \"kubernetes.io/projected/7ac43e45-8d37-4ab4-9ebe-441421fe9044-kube-api-access-d8xbz\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119451 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-combined-ca-bundle\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119477 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-scripts\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " 
pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119500 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-internal-tls-certs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.119546 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-public-tls-certs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.135617 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bb7b6f474-4bf42"] Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.136906 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.139780 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.141037 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tkr4w" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.141476 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.141581 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.153258 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bb7b6f474-4bf42"] Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.220989 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ps8t\" (UniqueName: \"kubernetes.io/projected/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-kube-api-access-7ps8t\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221039 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-config\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221087 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-public-tls-certs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: 
\"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221148 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221169 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-config-data\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221197 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac43e45-8d37-4ab4-9ebe-441421fe9044-logs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221262 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xbz\" (UniqueName: \"kubernetes.io/projected/7ac43e45-8d37-4ab4-9ebe-441421fe9044-kube-api-access-d8xbz\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221280 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221304 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-combined-ca-bundle\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221333 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-scripts\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221355 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.221379 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-internal-tls-certs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " 
pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.223365 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac43e45-8d37-4ab4-9ebe-441421fe9044-logs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.228293 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-public-tls-certs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.228690 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-config-data\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.228811 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-internal-tls-certs\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.232510 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-combined-ca-bundle\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.237759 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ac43e45-8d37-4ab4-9ebe-441421fe9044-scripts\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.241008 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xbz\" (UniqueName: \"kubernetes.io/projected/7ac43e45-8d37-4ab4-9ebe-441421fe9044-kube-api-access-d8xbz\") pod \"placement-7dcd7544cd-gnxgg\" (UID: \"7ac43e45-8d37-4ab4-9ebe-441421fe9044\") " pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.295257 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322609 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322702 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322754 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ps8t\" (UniqueName: \"kubernetes.io/projected/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-kube-api-access-7ps8t\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322780 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-config\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322824 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tlw\" (UniqueName: \"kubernetes.io/projected/8afc45c2-e8b6-4886-aa09-87ff2f284587-kube-api-access-82tlw\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322858 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-httpd-config\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322883 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-ovndb-tls-certs\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.322907 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-combined-ca-bundle\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.323046 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: 
\"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.323102 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.323138 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-config\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.323660 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.323877 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.323928 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-config\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.324396 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.324479 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.347418 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ps8t\" (UniqueName: \"kubernetes.io/projected/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-kube-api-access-7ps8t\") pod \"dnsmasq-dns-85ff748b95-tv7cs\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.403702 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.425731 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82tlw\" (UniqueName: \"kubernetes.io/projected/8afc45c2-e8b6-4886-aa09-87ff2f284587-kube-api-access-82tlw\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.425773 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-httpd-config\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.425791 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-ovndb-tls-certs\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.425813 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-combined-ca-bundle\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.425848 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-config\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.430087 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-httpd-config\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.431911 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-ovndb-tls-certs\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.432622 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-config\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.434307 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-combined-ca-bundle\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.446788 4728 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-82tlw\" (UniqueName: \"kubernetes.io/projected/8afc45c2-e8b6-4886-aa09-87ff2f284587-kube-api-access-82tlw\") pod \"neutron-6bb7b6f474-4bf42\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.464424 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.769098 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7dcd7544cd-gnxgg"] Dec 16 15:15:16 crc kubenswrapper[4728]: I1216 15:15:16.892000 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tv7cs"] Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.138212 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bb7b6f474-4bf42"] Dec 16 15:15:17 crc kubenswrapper[4728]: W1216 15:15:17.151238 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8afc45c2_e8b6_4886_aa09_87ff2f284587.slice/crio-60706bcc752a964d0b44edd92fa8b274b4fa219e87a05030ac3b15c2907c4636 WatchSource:0}: Error finding container 60706bcc752a964d0b44edd92fa8b274b4fa219e87a05030ac3b15c2907c4636: Status 404 returned error can't find the container with id 60706bcc752a964d0b44edd92fa8b274b4fa219e87a05030ac3b15c2907c4636 Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.336012 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5874cbd465-jjmn6" Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.441355 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8957f9486-cds65" event={"ID":"f3dd302c-4cb1-487b-9995-a99059ee9ac6","Type":"ContainerStarted","Data":"372e9d54009a215be2e69cfb600b6efe0c30f88a9cb4b57bc6ed2c2abcfa17d8"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.441711 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8957f9486-cds65" event={"ID":"f3dd302c-4cb1-487b-9995-a99059ee9ac6","Type":"ContainerStarted","Data":"6651b5be2d964b1e3e1490b990f157659b320177be41bbf10a6b863f2a943be9"} Dec 16 15:15:17 crc kubenswrapper[4728]: E1216 15:15:17.449458 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6964d7cb_c1bb_4296_ad17_d56280a0e8f0.slice/crio-conmon-bec4747c7d00dee4e3413fdb56fb8612b07fcff3c1529bc844705155109166a3.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.450034 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"916a6b2e-6b7b-457e-b2a2-80d02edc2217","Type":"ContainerStarted","Data":"238866146b238760e52aa044cfbec6bb7112b3a6ebc78826658a15ac386aba0a"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.451954 4728 generic.go:334] "Generic (PLEG): container finished" podID="b159943c-5acb-49ba-951d-fb64f30525d2" containerID="f54aa5d5c22d3a9276bc6e0addcbecdf3287e642a70fb7b88f139e3be8fb7084" exitCode=0 Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.451989 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" 
event={"ID":"b159943c-5acb-49ba-951d-fb64f30525d2","Type":"ContainerDied","Data":"f54aa5d5c22d3a9276bc6e0addcbecdf3287e642a70fb7b88f139e3be8fb7084"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.453533 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dcd7544cd-gnxgg" event={"ID":"7ac43e45-8d37-4ab4-9ebe-441421fe9044","Type":"ContainerStarted","Data":"b71cecd5e34574703f9f2a0e23fbea88d19c9cbb4430a8ccddaa75958779b719"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.453558 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dcd7544cd-gnxgg" event={"ID":"7ac43e45-8d37-4ab4-9ebe-441421fe9044","Type":"ContainerStarted","Data":"028c8754bd7e9ea1ccc48333d1ecb0f28d3615bfe5cab0e6789ea8610a449d0d"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.458237 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86cff44659-k2jp2" event={"ID":"e3e0ec72-0e84-444e-a66f-50b4fe91adb5","Type":"ContainerStarted","Data":"91e35f387ada9a09ec13672f1c677e74c89a37f81bb1a972d67242be846dfda8"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.465883 4728 generic.go:334] "Generic (PLEG): container finished" podID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerID="bec4747c7d00dee4e3413fdb56fb8612b07fcff3c1529bc844705155109166a3" exitCode=0 Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.465964 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" event={"ID":"6964d7cb-c1bb-4296-ad17-d56280a0e8f0","Type":"ContainerDied","Data":"bec4747c7d00dee4e3413fdb56fb8612b07fcff3c1529bc844705155109166a3"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.465990 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" event={"ID":"6964d7cb-c1bb-4296-ad17-d56280a0e8f0","Type":"ContainerStarted","Data":"870a6d7e4e4dbe748a9091d99d14f68eacbda0ce459e2241af0044f0d5f4a6e3"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.476223 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bb7b6f474-4bf42" event={"ID":"8afc45c2-e8b6-4886-aa09-87ff2f284587","Type":"ContainerStarted","Data":"60706bcc752a964d0b44edd92fa8b274b4fa219e87a05030ac3b15c2907c4636"} Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.484660 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8957f9486-cds65" podStartSLOduration=7.081102911 podStartE2EDuration="29.484641814s" podCreationTimestamp="2025-12-16 15:14:48 +0000 UTC" firstStartedPulling="2025-12-16 15:14:52.175901328 +0000 UTC m=+1073.016080312" lastFinishedPulling="2025-12-16 15:15:14.579440231 +0000 UTC m=+1095.419619215" observedRunningTime="2025-12-16 15:15:17.477673719 +0000 UTC m=+1098.317852703" watchObservedRunningTime="2025-12-16 15:15:17.484641814 +0000 UTC m=+1098.324820798" Dec 16 15:15:17 crc kubenswrapper[4728]: I1216 15:15:17.500791 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=23.500772902 podStartE2EDuration="23.500772902s" podCreationTimestamp="2025-12-16 15:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:17.500056404 +0000 UTC m=+1098.340235388" watchObservedRunningTime="2025-12-16 15:15:17.500772902 +0000 UTC m=+1098.340951876" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 
15:15:18.348034 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.489917 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9h2w\" (UniqueName: \"kubernetes.io/projected/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-kube-api-access-p9h2w\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.490442 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-run-httpd\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.490566 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-config-data\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.490657 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-combined-ca-bundle\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.490733 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-log-httpd\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.490820 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-sg-core-conf-yaml\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.490888 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-scripts\") pod \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\" (UID: \"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce\") " Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.496136 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.497173 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.497336 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-scripts" (OuterVolumeSpecName: "scripts") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.508672 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7dcd7544cd-gnxgg" event={"ID":"7ac43e45-8d37-4ab4-9ebe-441421fe9044","Type":"ContainerStarted","Data":"d3b2ed65fef43e67a637ee7e27c78fb0fc4115f9e7e41e565fd543dfffb3ca68"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.510499 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.510557 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.513767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-86cff44659-k2jp2" event={"ID":"e3e0ec72-0e84-444e-a66f-50b4fe91adb5","Type":"ContainerStarted","Data":"a7e8367441c09932eb6753e94d36cb682dab4a0ab7302cca6b7b36e84108ceb9"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.516375 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" event={"ID":"6964d7cb-c1bb-4296-ad17-d56280a0e8f0","Type":"ContainerStarted","Data":"6fc491f73133d11fbaa46dd0217df12f4d855567ea1a4a94d59719d5897f8318"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.516573 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-kube-api-access-p9h2w" (OuterVolumeSpecName: "kube-api-access-p9h2w") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "kube-api-access-p9h2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.516957 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.524473 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79c9d99cd5-967vg"] Dec 16 15:15:18 crc kubenswrapper[4728]: E1216 15:15:18.525069 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="sg-core" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.525081 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="sg-core" Dec 16 15:15:18 crc kubenswrapper[4728]: E1216 15:15:18.525114 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="ceilometer-notification-agent" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.525120 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="ceilometer-notification-agent" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.525284 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="ceilometer-notification-agent" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.525309 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerName="sg-core" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.526165 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.533134 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.533316 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.545621 4728 generic.go:334] "Generic (PLEG): container finished" podID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" containerID="258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131" exitCode=0 Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.545712 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.545717 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce","Type":"ContainerDied","Data":"258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.547745 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce","Type":"ContainerDied","Data":"fc6ec3c43415a7f2328272ec27591e6e1be18f0456a67f5da211ef8eac91e127"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.547778 4728 scope.go:117] "RemoveContainer" containerID="ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.549699 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79c9d99cd5-967vg"] Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.551624 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.551736 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bb7b6f474-4bf42" event={"ID":"8afc45c2-e8b6-4886-aa09-87ff2f284587","Type":"ContainerStarted","Data":"480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.551777 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bb7b6f474-4bf42" event={"ID":"8afc45c2-e8b6-4886-aa09-87ff2f284587","Type":"ContainerStarted","Data":"5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df"} Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.551988 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.552040 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7dcd7544cd-gnxgg" podStartSLOduration=3.552018253 podStartE2EDuration="3.552018253s" podCreationTimestamp="2025-12-16 15:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:18.544623427 +0000 UTC m=+1099.384802411" watchObservedRunningTime="2025-12-16 15:15:18.552018253 +0000 UTC m=+1099.392197237" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.557612 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.580702 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-config-data" (OuterVolumeSpecName: "config-data") pod "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" (UID: "1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592869 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9h2w\" (UniqueName: \"kubernetes.io/projected/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-kube-api-access-p9h2w\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592901 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592912 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592920 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592929 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592939 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.592948 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.609334 4728 scope.go:117] "RemoveContainer" containerID="258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.613399 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" podStartSLOduration=2.613377593 podStartE2EDuration="2.613377593s" podCreationTimestamp="2025-12-16 15:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:18.602550485 +0000 UTC m=+1099.442729469" watchObservedRunningTime="2025-12-16 15:15:18.613377593 +0000 UTC m=+1099.453556607" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.632608 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-86cff44659-k2jp2" podStartSLOduration=8.24977339 podStartE2EDuration="30.632590443s" podCreationTimestamp="2025-12-16 15:14:48 +0000 UTC" firstStartedPulling="2025-12-16 15:14:52.254851745 +0000 UTC m=+1073.095030729" lastFinishedPulling="2025-12-16 15:15:14.637668798 +0000 UTC m=+1095.477847782" 
observedRunningTime="2025-12-16 15:15:18.622896945 +0000 UTC m=+1099.463075929" watchObservedRunningTime="2025-12-16 15:15:18.632590443 +0000 UTC m=+1099.472769427" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.662628 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bb7b6f474-4bf42" podStartSLOduration=2.662605301 podStartE2EDuration="2.662605301s" podCreationTimestamp="2025-12-16 15:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:18.641953952 +0000 UTC m=+1099.482132936" watchObservedRunningTime="2025-12-16 15:15:18.662605301 +0000 UTC m=+1099.502784285" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.696277 4728 scope.go:117] "RemoveContainer" containerID="ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.697818 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-ovndb-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.697869 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-config\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.697928 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-internal-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.697960 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcjmg\" (UniqueName: \"kubernetes.io/projected/fcc16f45-2441-47bf-a452-25f78e044a7e-kube-api-access-mcjmg\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.698011 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-httpd-config\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.698027 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-public-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.702853 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-combined-ca-bundle\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: E1216 15:15:18.705600 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2\": container with ID starting with ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2 not found: ID does not exist" containerID="ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.705647 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2"} err="failed to get container status \"ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2\": rpc error: code = NotFound desc = could not find container \"ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2\": container with ID starting with ad930ff0e4917198e478a059add7304eeeea809f96b64b56fde0f22862cf79c2 not found: ID does not exist" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.705676 4728 scope.go:117] "RemoveContainer" containerID="258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131" Dec 16 15:15:18 crc kubenswrapper[4728]: E1216 15:15:18.706664 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131\": container with ID starting with 258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131 not found: ID does not exist" containerID="258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.706699 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131"} err="failed to get container status \"258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131\": rpc error: code = NotFound desc = could not find container \"258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131\": container with ID starting with 258d0a3fe81ce6d081fe483a446254e0db012b129e4ce98ad9ec9dc061068131 not found: ID does not exist" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806439 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcjmg\" (UniqueName: \"kubernetes.io/projected/fcc16f45-2441-47bf-a452-25f78e044a7e-kube-api-access-mcjmg\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806543 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-httpd-config\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806570 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-public-tls-certs\") pod 
\"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806639 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-combined-ca-bundle\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806732 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-ovndb-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-config\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.806806 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-internal-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.811912 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-internal-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.813225 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-ovndb-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.813473 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-httpd-config\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.813859 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-public-tls-certs\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.813949 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-combined-ca-bundle\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 
15:15:18.814675 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcc16f45-2441-47bf-a452-25f78e044a7e-config\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.824472 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcjmg\" (UniqueName: \"kubernetes.io/projected/fcc16f45-2441-47bf-a452-25f78e044a7e-kube-api-access-mcjmg\") pod \"neutron-79c9d99cd5-967vg\" (UID: \"fcc16f45-2441-47bf-a452-25f78e044a7e\") " pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.865913 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:18 crc kubenswrapper[4728]: I1216 15:15:18.979186 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.097505 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.124014 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b159943c-5acb-49ba-951d-fb64f30525d2-config-volume\") pod \"b159943c-5acb-49ba-951d-fb64f30525d2\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.124190 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j6lc\" (UniqueName: \"kubernetes.io/projected/b159943c-5acb-49ba-951d-fb64f30525d2-kube-api-access-5j6lc\") pod \"b159943c-5acb-49ba-951d-fb64f30525d2\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.124253 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b159943c-5acb-49ba-951d-fb64f30525d2-secret-volume\") pod \"b159943c-5acb-49ba-951d-fb64f30525d2\" (UID: \"b159943c-5acb-49ba-951d-fb64f30525d2\") " Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.125877 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b159943c-5acb-49ba-951d-fb64f30525d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "b159943c-5acb-49ba-951d-fb64f30525d2" (UID: "b159943c-5acb-49ba-951d-fb64f30525d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.128963 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b159943c-5acb-49ba-951d-fb64f30525d2-kube-api-access-5j6lc" (OuterVolumeSpecName: "kube-api-access-5j6lc") pod "b159943c-5acb-49ba-951d-fb64f30525d2" (UID: "b159943c-5acb-49ba-951d-fb64f30525d2"). InnerVolumeSpecName "kube-api-access-5j6lc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.131501 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.131843 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b159943c-5acb-49ba-951d-fb64f30525d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b159943c-5acb-49ba-951d-fb64f30525d2" (UID: "b159943c-5acb-49ba-951d-fb64f30525d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.191492 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:19 crc kubenswrapper[4728]: E1216 15:15:19.191854 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159943c-5acb-49ba-951d-fb64f30525d2" containerName="collect-profiles" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.191870 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159943c-5acb-49ba-951d-fb64f30525d2" containerName="collect-profiles" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.192011 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b159943c-5acb-49ba-951d-fb64f30525d2" containerName="collect-profiles" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.206771 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.211690 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.211883 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.226817 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.227427 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j6lc\" (UniqueName: \"kubernetes.io/projected/b159943c-5acb-49ba-951d-fb64f30525d2-kube-api-access-5j6lc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.227453 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b159943c-5acb-49ba-951d-fb64f30525d2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.227464 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b159943c-5acb-49ba-951d-fb64f30525d2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329267 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45vm7\" (UniqueName: \"kubernetes.io/projected/6300b826-1fb2-439e-b26f-fabdbc0aef58-kube-api-access-45vm7\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329308 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-config-data\") pod \"ceilometer-0\" (UID: 
\"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329337 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-run-httpd\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329354 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-log-httpd\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329607 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329669 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.329940 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-scripts\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.428152 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79c9d99cd5-967vg"] Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432004 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-scripts\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432094 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45vm7\" (UniqueName: \"kubernetes.io/projected/6300b826-1fb2-439e-b26f-fabdbc0aef58-kube-api-access-45vm7\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432142 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-config-data\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432179 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-run-httpd\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432203 
4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-log-httpd\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432278 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432307 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432741 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-run-httpd\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.432826 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-log-httpd\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.436654 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-config-data\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.439842 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-scripts\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.439921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.440265 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.449298 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45vm7\" (UniqueName: \"kubernetes.io/projected/6300b826-1fb2-439e-b26f-fabdbc0aef58-kube-api-access-45vm7\") pod \"ceilometer-0\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.533155 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce" path="/var/lib/kubelet/pods/1ac532d2-7fd7-4d34-adc4-4662d6dcf3ce/volumes" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.537481 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.566116 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.566116 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc" event={"ID":"b159943c-5acb-49ba-951d-fb64f30525d2","Type":"ContainerDied","Data":"b3817a126bb21f2cda6e98c06e6c6c18381bec0a642787fbadd5b8159ae727ef"} Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.566261 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3817a126bb21f2cda6e98c06e6c6c18381bec0a642787fbadd5b8159ae727ef" Dec 16 15:15:19 crc kubenswrapper[4728]: I1216 15:15:19.568051 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79c9d99cd5-967vg" event={"ID":"fcc16f45-2441-47bf-a452-25f78e044a7e","Type":"ContainerStarted","Data":"8fe852ba049adb5676d48f877c3bdf4a7cbdaa5bad3b7c954500d575c41b5410"} Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.034226 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:20 crc kubenswrapper[4728]: W1216 15:15:20.878129 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6300b826_1fb2_439e_b26f_fabdbc0aef58.slice/crio-71b34b4ce1e26772c4d432b67af6701c4f3d7c5196df452a7c0a13586fe861e4 WatchSource:0}: Error finding container 71b34b4ce1e26772c4d432b67af6701c4f3d7c5196df452a7c0a13586fe861e4: Status 404 returned error can't find the container with id 71b34b4ce1e26772c4d432b67af6701c4f3d7c5196df452a7c0a13586fe861e4 Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.940469 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.942132 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.945875 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.946129 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.948527 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wh27n" Dec 16 15:15:20 crc kubenswrapper[4728]: I1216 15:15:20.957334 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.063137 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09f99482-afc8-48dd-95a3-ada07d611db1-openstack-config-secret\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.063196 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09f99482-afc8-48dd-95a3-ada07d611db1-openstack-config\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.063830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f99482-afc8-48dd-95a3-ada07d611db1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.064481 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxxf\" (UniqueName: \"kubernetes.io/projected/09f99482-afc8-48dd-95a3-ada07d611db1-kube-api-access-bhxxf\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.166497 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f99482-afc8-48dd-95a3-ada07d611db1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.166536 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxxf\" (UniqueName: \"kubernetes.io/projected/09f99482-afc8-48dd-95a3-ada07d611db1-kube-api-access-bhxxf\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.166622 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09f99482-afc8-48dd-95a3-ada07d611db1-openstack-config-secret\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.166650 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09f99482-afc8-48dd-95a3-ada07d611db1-openstack-config\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.167542 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09f99482-afc8-48dd-95a3-ada07d611db1-openstack-config\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.173947 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f99482-afc8-48dd-95a3-ada07d611db1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.174845 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09f99482-afc8-48dd-95a3-ada07d611db1-openstack-config-secret\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.189092 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxxf\" (UniqueName: \"kubernetes.io/projected/09f99482-afc8-48dd-95a3-ada07d611db1-kube-api-access-bhxxf\") pod \"openstackclient\" (UID: \"09f99482-afc8-48dd-95a3-ada07d611db1\") " pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.419581 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.892971 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.921846 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79c9d99cd5-967vg" event={"ID":"fcc16f45-2441-47bf-a452-25f78e044a7e","Type":"ContainerStarted","Data":"a1335b416146d258d4689f114df5a9fe20966da6c2513e94129f030affa5ee7a"} Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.922133 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.922148 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79c9d99cd5-967vg" event={"ID":"fcc16f45-2441-47bf-a452-25f78e044a7e","Type":"ContainerStarted","Data":"17f3d4b08fa9e6833a16b6299c071726b992679f68d8aae99f40da3e238bd1f0"} Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.922854 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerStarted","Data":"71b34b4ce1e26772c4d432b67af6701c4f3d7c5196df452a7c0a13586fe861e4"} Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.923582 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09f99482-afc8-48dd-95a3-ada07d611db1","Type":"ContainerStarted","Data":"e0764e8cb163a308b3aea5a8dbc5a3fbff67d1c912acf6d53ec9bd9023f6d68d"} Dec 16 15:15:21 crc kubenswrapper[4728]: I1216 15:15:21.944784 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79c9d99cd5-967vg" podStartSLOduration=3.944766542 podStartE2EDuration="3.944766542s" podCreationTimestamp="2025-12-16 15:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:21.942622083 +0000 UTC m=+1102.782801067" watchObservedRunningTime="2025-12-16 15:15:21.944766542 +0000 UTC m=+1102.784945526" Dec 16 15:15:22 crc kubenswrapper[4728]: I1216 15:15:22.934553 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerStarted","Data":"8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7"} Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.525477 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-589dd4bc84-6zndr" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.850493 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-54bb7475-hxsvl"] Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.852324 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.857510 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.857689 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.858118 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.878385 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-54bb7475-hxsvl"] Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.918273 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8xx\" (UniqueName: \"kubernetes.io/projected/b5b59721-592a-4649-8246-0487a18177b9-kube-api-access-5l8xx\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.918376 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-public-tls-certs\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.918687 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b59721-592a-4649-8246-0487a18177b9-etc-swift\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.918945 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-combined-ca-bundle\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.919002 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-config-data\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.919030 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5b59721-592a-4649-8246-0487a18177b9-run-httpd\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.919082 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5b59721-592a-4649-8246-0487a18177b9-log-httpd\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 
15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.919110 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-internal-tls-certs\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:23 crc kubenswrapper[4728]: I1216 15:15:23.943205 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerStarted","Data":"9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc"} Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021137 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8xx\" (UniqueName: \"kubernetes.io/projected/b5b59721-592a-4649-8246-0487a18177b9-kube-api-access-5l8xx\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021177 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-public-tls-certs\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021250 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b59721-592a-4649-8246-0487a18177b9-etc-swift\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021272 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-combined-ca-bundle\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021298 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-config-data\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021315 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5b59721-592a-4649-8246-0487a18177b9-run-httpd\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021341 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5b59721-592a-4649-8246-0487a18177b9-log-httpd\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.021360 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-internal-tls-certs\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.023827 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5b59721-592a-4649-8246-0487a18177b9-log-httpd\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.024775 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b5b59721-592a-4649-8246-0487a18177b9-run-httpd\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.027432 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-config-data\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.029493 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-public-tls-certs\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.030101 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-internal-tls-certs\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.030900 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b59721-592a-4649-8246-0487a18177b9-combined-ca-bundle\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.039376 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b5b59721-592a-4649-8246-0487a18177b9-etc-swift\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.044120 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8xx\" (UniqueName: \"kubernetes.io/projected/b5b59721-592a-4649-8246-0487a18177b9-kube-api-access-5l8xx\") pod \"swift-proxy-54bb7475-hxsvl\" (UID: \"b5b59721-592a-4649-8246-0487a18177b9\") " pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.175535 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.843917 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.844369 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.844385 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.844398 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.864202 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-54bb7475-hxsvl"] Dec 16 15:15:24 crc kubenswrapper[4728]: W1216 15:15:24.864497 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b59721_592a_4649_8246_0487a18177b9.slice/crio-b98cbd13cce89d5d14a6c6691eed8af92d67083f104b34d965e8ced2bf18a004 WatchSource:0}: Error finding container b98cbd13cce89d5d14a6c6691eed8af92d67083f104b34d965e8ced2bf18a004: Status 404 returned error can't find the container with id b98cbd13cce89d5d14a6c6691eed8af92d67083f104b34d965e8ced2bf18a004 Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.910886 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.925945 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:24 crc kubenswrapper[4728]: I1216 15:15:24.951934 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54bb7475-hxsvl" event={"ID":"b5b59721-592a-4649-8246-0487a18177b9","Type":"ContainerStarted","Data":"b98cbd13cce89d5d14a6c6691eed8af92d67083f104b34d965e8ced2bf18a004"} Dec 16 15:15:26 crc kubenswrapper[4728]: I1216 15:15:26.406383 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:15:26 crc kubenswrapper[4728]: I1216 15:15:26.469847 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7f56"] Dec 16 15:15:26 crc kubenswrapper[4728]: I1216 15:15:26.470080 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="dnsmasq-dns" containerID="cri-o://0574e902672c8b10da2dda38fe7bb1222ec789e485bd15ed54f7030f6919a931" gracePeriod=10 Dec 16 15:15:26 crc kubenswrapper[4728]: I1216 15:15:26.868791 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:27 crc kubenswrapper[4728]: E1216 15:15:27.295131 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9d9zb" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" Dec 16 15:15:27 crc kubenswrapper[4728]: I1216 15:15:27.427084 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 16 15:15:27 crc kubenswrapper[4728]: I1216 15:15:27.427174 4728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 15:15:27 crc kubenswrapper[4728]: I1216 15:15:27.430212 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 15:15:27 crc kubenswrapper[4728]: I1216 15:15:27.980311 4728 generic.go:334] "Generic (PLEG): container finished" podID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerID="0574e902672c8b10da2dda38fe7bb1222ec789e485bd15ed54f7030f6919a931" exitCode=0 Dec 16 15:15:27 crc kubenswrapper[4728]: I1216 15:15:27.980462 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" event={"ID":"c659dc6b-019b-4cc8-81c5-2a7732c684c6","Type":"ContainerDied","Data":"0574e902672c8b10da2dda38fe7bb1222ec789e485bd15ed54f7030f6919a931"} Dec 16 15:15:28 crc kubenswrapper[4728]: I1216 15:15:28.989474 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54bb7475-hxsvl" event={"ID":"b5b59721-592a-4649-8246-0487a18177b9","Type":"ContainerStarted","Data":"8a98dc2c964487a5679b97ca6184186207a6e63e76b99c24c4e4d11b4c070a55"} Dec 16 15:15:33 crc kubenswrapper[4728]: I1216 15:15:33.525611 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-589dd4bc84-6zndr" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 16 15:15:33 crc kubenswrapper[4728]: I1216 15:15:33.526283 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:15:33 crc kubenswrapper[4728]: I1216 15:15:33.591644 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.295850 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.335920 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-sb\") pod \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.336058 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-svc\") pod \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.336080 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-swift-storage-0\") pod \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.336123 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-config\") pod \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.336145 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-252zz\" (UniqueName: \"kubernetes.io/projected/c659dc6b-019b-4cc8-81c5-2a7732c684c6-kube-api-access-252zz\") pod \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.336188 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-nb\") pod \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\" (UID: \"c659dc6b-019b-4cc8-81c5-2a7732c684c6\") " Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.351189 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c659dc6b-019b-4cc8-81c5-2a7732c684c6-kube-api-access-252zz" (OuterVolumeSpecName: "kube-api-access-252zz") pod "c659dc6b-019b-4cc8-81c5-2a7732c684c6" (UID: "c659dc6b-019b-4cc8-81c5-2a7732c684c6"). InnerVolumeSpecName "kube-api-access-252zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.438021 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-252zz\" (UniqueName: \"kubernetes.io/projected/c659dc6b-019b-4cc8-81c5-2a7732c684c6-kube-api-access-252zz\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.499466 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-config" (OuterVolumeSpecName: "config") pod "c659dc6b-019b-4cc8-81c5-2a7732c684c6" (UID: "c659dc6b-019b-4cc8-81c5-2a7732c684c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.507739 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c659dc6b-019b-4cc8-81c5-2a7732c684c6" (UID: "c659dc6b-019b-4cc8-81c5-2a7732c684c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.511849 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c659dc6b-019b-4cc8-81c5-2a7732c684c6" (UID: "c659dc6b-019b-4cc8-81c5-2a7732c684c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.521087 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c659dc6b-019b-4cc8-81c5-2a7732c684c6" (UID: "c659dc6b-019b-4cc8-81c5-2a7732c684c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.535880 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c659dc6b-019b-4cc8-81c5-2a7732c684c6" (UID: "c659dc6b-019b-4cc8-81c5-2a7732c684c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.539590 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.539619 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.539629 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.539638 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:34 crc kubenswrapper[4728]: I1216 15:15:34.539646 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c659dc6b-019b-4cc8-81c5-2a7732c684c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.066566 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-54bb7475-hxsvl" event={"ID":"b5b59721-592a-4649-8246-0487a18177b9","Type":"ContainerStarted","Data":"da745500baba7be253f4c49394a676e54f6070b88ceb87577a6150c7807e695d"} Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.066994 4728 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.069931 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerStarted","Data":"056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc"} Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.071627 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09f99482-afc8-48dd-95a3-ada07d611db1","Type":"ContainerStarted","Data":"7ca551a2707d691c6c83f41c7c96b2be0aec1117e1e31c65a62af055ab44c4d3"} Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.077636 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" event={"ID":"c659dc6b-019b-4cc8-81c5-2a7732c684c6","Type":"ContainerDied","Data":"1067232c91febf7b86f917d6f8ff86b842cfd6e5f45a0a5eb4ffbce73245e8f1"} Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.077684 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.077699 4728 scope.go:117] "RemoveContainer" containerID="0574e902672c8b10da2dda38fe7bb1222ec789e485bd15ed54f7030f6919a931" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.095629 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-54bb7475-hxsvl" podStartSLOduration=12.095612528 podStartE2EDuration="12.095612528s" podCreationTimestamp="2025-12-16 15:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:35.093768098 +0000 UTC m=+1115.933947112" watchObservedRunningTime="2025-12-16 15:15:35.095612528 +0000 UTC m=+1115.935791512" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.110209 4728 scope.go:117] "RemoveContainer" containerID="acd31ec7169929cc4145fca96653ce713cf810c753c49c9fe28898aba436f16f" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.142182 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.8937588 podStartE2EDuration="15.142158632s" podCreationTimestamp="2025-12-16 15:15:20 +0000 UTC" firstStartedPulling="2025-12-16 15:15:21.910705346 +0000 UTC m=+1102.750884330" lastFinishedPulling="2025-12-16 15:15:34.159105178 +0000 UTC m=+1114.999284162" observedRunningTime="2025-12-16 15:15:35.128147322 +0000 UTC m=+1115.968326306" watchObservedRunningTime="2025-12-16 15:15:35.142158632 +0000 UTC m=+1115.982337616" Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.167114 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7f56"] Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.185240 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-s7f56"] Dec 16 15:15:35 crc kubenswrapper[4728]: I1216 15:15:35.526306 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" path="/var/lib/kubelet/pods/c659dc6b-019b-4cc8-81c5-2a7732c684c6/volumes" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.089841 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.107792 4728 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.754087 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gbntn"] Dec 16 15:15:36 crc kubenswrapper[4728]: E1216 15:15:36.754724 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="init" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.754740 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="init" Dec 16 15:15:36 crc kubenswrapper[4728]: E1216 15:15:36.754754 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="dnsmasq-dns" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.754762 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="dnsmasq-dns" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.754933 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="dnsmasq-dns" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.755552 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.785857 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/887aeac4-be08-4765-ac4c-1f2ac326d1a3-operator-scripts\") pod \"nova-api-db-create-gbntn\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.785947 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tclcp\" (UniqueName: \"kubernetes.io/projected/887aeac4-be08-4765-ac4c-1f2ac326d1a3-kube-api-access-tclcp\") pod \"nova-api-db-create-gbntn\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.789537 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gbntn"] Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.856114 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-swl7z"] Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.857190 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.865595 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-swl7z"] Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.887034 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c269bf0a-1104-49e8-8c99-9ce6926f55c2-operator-scripts\") pod \"nova-cell0-db-create-swl7z\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.887091 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tclcp\" (UniqueName: \"kubernetes.io/projected/887aeac4-be08-4765-ac4c-1f2ac326d1a3-kube-api-access-tclcp\") pod \"nova-api-db-create-gbntn\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.887114 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqqw7\" (UniqueName: \"kubernetes.io/projected/c269bf0a-1104-49e8-8c99-9ce6926f55c2-kube-api-access-gqqw7\") pod \"nova-cell0-db-create-swl7z\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.887510 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/887aeac4-be08-4765-ac4c-1f2ac326d1a3-operator-scripts\") pod \"nova-api-db-create-gbntn\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.888357 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/887aeac4-be08-4765-ac4c-1f2ac326d1a3-operator-scripts\") pod \"nova-api-db-create-gbntn\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.902954 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tclcp\" (UniqueName: \"kubernetes.io/projected/887aeac4-be08-4765-ac4c-1f2ac326d1a3-kube-api-access-tclcp\") pod \"nova-api-db-create-gbntn\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.975119 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-km67h"] Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.976758 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.985132 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6ee5-account-create-update-6k7r6"] Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.986271 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.988443 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.988871 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c269bf0a-1104-49e8-8c99-9ce6926f55c2-operator-scripts\") pod \"nova-cell0-db-create-swl7z\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.988925 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqqw7\" (UniqueName: \"kubernetes.io/projected/c269bf0a-1104-49e8-8c99-9ce6926f55c2-kube-api-access-gqqw7\") pod \"nova-cell0-db-create-swl7z\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:36 crc kubenswrapper[4728]: I1216 15:15:36.989508 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c269bf0a-1104-49e8-8c99-9ce6926f55c2-operator-scripts\") pod \"nova-cell0-db-create-swl7z\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.005517 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-km67h"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.010912 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqqw7\" (UniqueName: \"kubernetes.io/projected/c269bf0a-1104-49e8-8c99-9ce6926f55c2-kube-api-access-gqqw7\") pod \"nova-cell0-db-create-swl7z\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.012120 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6ee5-account-create-update-6k7r6"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.085571 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.090427 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgzpq\" (UniqueName: \"kubernetes.io/projected/381f4a37-75d0-4da2-a183-875b9bc481aa-kube-api-access-lgzpq\") pod \"nova-cell1-db-create-km67h\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.090510 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-operator-scripts\") pod \"nova-api-6ee5-account-create-update-6k7r6\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.090530 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/381f4a37-75d0-4da2-a183-875b9bc481aa-operator-scripts\") pod \"nova-cell1-db-create-km67h\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.090566 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9m8j\" (UniqueName: \"kubernetes.io/projected/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-kube-api-access-t9m8j\") pod \"nova-api-6ee5-account-create-update-6k7r6\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.105589 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerStarted","Data":"912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75"} Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.105688 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-central-agent" containerID="cri-o://8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7" gracePeriod=30 Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.105742 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.105836 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="proxy-httpd" containerID="cri-o://912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75" gracePeriod=30 Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.105888 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="sg-core" containerID="cri-o://056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc" gracePeriod=30 Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.105928 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-notification-agent" 
containerID="cri-o://9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc" gracePeriod=30 Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.134953 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6447754789999998 podStartE2EDuration="18.134938197s" podCreationTimestamp="2025-12-16 15:15:19 +0000 UTC" firstStartedPulling="2025-12-16 15:15:20.879673316 +0000 UTC m=+1101.719852290" lastFinishedPulling="2025-12-16 15:15:36.369836014 +0000 UTC m=+1117.210015008" observedRunningTime="2025-12-16 15:15:37.133027065 +0000 UTC m=+1117.973206049" watchObservedRunningTime="2025-12-16 15:15:37.134938197 +0000 UTC m=+1117.975117181" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.178232 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7b52-account-create-update-q2f6l"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.179808 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.182284 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.192023 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-operator-scripts\") pod \"nova-api-6ee5-account-create-update-6k7r6\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.192077 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/381f4a37-75d0-4da2-a183-875b9bc481aa-operator-scripts\") pod \"nova-cell1-db-create-km67h\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.192135 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9m8j\" (UniqueName: \"kubernetes.io/projected/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-kube-api-access-t9m8j\") pod \"nova-api-6ee5-account-create-update-6k7r6\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.192279 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgzpq\" (UniqueName: \"kubernetes.io/projected/381f4a37-75d0-4da2-a183-875b9bc481aa-kube-api-access-lgzpq\") pod \"nova-cell1-db-create-km67h\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.193436 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-operator-scripts\") pod \"nova-api-6ee5-account-create-update-6k7r6\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.194433 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/381f4a37-75d0-4da2-a183-875b9bc481aa-operator-scripts\") pod \"nova-cell1-db-create-km67h\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.199870 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7b52-account-create-update-q2f6l"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.212766 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgzpq\" (UniqueName: \"kubernetes.io/projected/381f4a37-75d0-4da2-a183-875b9bc481aa-kube-api-access-lgzpq\") pod \"nova-cell1-db-create-km67h\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.222888 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9m8j\" (UniqueName: \"kubernetes.io/projected/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-kube-api-access-t9m8j\") pod \"nova-api-6ee5-account-create-update-6k7r6\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.273895 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.294158 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwl8\" (UniqueName: \"kubernetes.io/projected/56aab1b9-1cbf-4647-b025-581f674334d6-kube-api-access-xrwl8\") pod \"nova-cell0-7b52-account-create-update-q2f6l\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.294417 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aab1b9-1cbf-4647-b025-581f674334d6-operator-scripts\") pod \"nova-cell0-7b52-account-create-update-q2f6l\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.302657 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.350558 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.395523 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2e16-account-create-update-stnh7"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.396567 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aab1b9-1cbf-4647-b025-581f674334d6-operator-scripts\") pod \"nova-cell0-7b52-account-create-update-q2f6l\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.396677 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwl8\" (UniqueName: \"kubernetes.io/projected/56aab1b9-1cbf-4647-b025-581f674334d6-kube-api-access-xrwl8\") pod \"nova-cell0-7b52-account-create-update-q2f6l\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.396684 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.399096 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.405597 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aab1b9-1cbf-4647-b025-581f674334d6-operator-scripts\") pod \"nova-cell0-7b52-account-create-update-q2f6l\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.421296 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2e16-account-create-update-stnh7"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.422469 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwl8\" (UniqueName: \"kubernetes.io/projected/56aab1b9-1cbf-4647-b025-581f674334d6-kube-api-access-xrwl8\") pod \"nova-cell0-7b52-account-create-update-q2f6l\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.500847 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddkdp\" (UniqueName: \"kubernetes.io/projected/296e275b-fc1b-4946-a4f2-2d61fac9aff8-kube-api-access-ddkdp\") pod \"nova-cell1-2e16-account-create-update-stnh7\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.501299 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296e275b-fc1b-4946-a4f2-2d61fac9aff8-operator-scripts\") pod \"nova-cell1-2e16-account-create-update-stnh7\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.501583 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.560945 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gbntn"] Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.602685 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296e275b-fc1b-4946-a4f2-2d61fac9aff8-operator-scripts\") pod \"nova-cell1-2e16-account-create-update-stnh7\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.602877 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddkdp\" (UniqueName: \"kubernetes.io/projected/296e275b-fc1b-4946-a4f2-2d61fac9aff8-kube-api-access-ddkdp\") pod \"nova-cell1-2e16-account-create-update-stnh7\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.604516 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296e275b-fc1b-4946-a4f2-2d61fac9aff8-operator-scripts\") pod \"nova-cell1-2e16-account-create-update-stnh7\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.625342 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddkdp\" (UniqueName: \"kubernetes.io/projected/296e275b-fc1b-4946-a4f2-2d61fac9aff8-kube-api-access-ddkdp\") pod \"nova-cell1-2e16-account-create-update-stnh7\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.655236 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.773331 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-swl7z"] Dec 16 15:15:37 crc kubenswrapper[4728]: W1216 15:15:37.778770 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc269bf0a_1104_49e8_8c99_9ce6926f55c2.slice/crio-13ddd0db5dbdc491cded27b8cb7a39a22ca7a4fe8775e15d2a3efc647d0f1303 WatchSource:0}: Error finding container 13ddd0db5dbdc491cded27b8cb7a39a22ca7a4fe8775e15d2a3efc647d0f1303: Status 404 returned error can't find the container with id 13ddd0db5dbdc491cded27b8cb7a39a22ca7a4fe8775e15d2a3efc647d0f1303 Dec 16 15:15:37 crc kubenswrapper[4728]: W1216 15:15:37.920576 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381f4a37_75d0_4da2_a183_875b9bc481aa.slice/crio-2ea93755c64e305d3ac4cac5d841b6fc91af530a710c2dddd686349bde25a732 WatchSource:0}: Error finding container 2ea93755c64e305d3ac4cac5d841b6fc91af530a710c2dddd686349bde25a732: Status 404 returned error can't find the container with id 2ea93755c64e305d3ac4cac5d841b6fc91af530a710c2dddd686349bde25a732 Dec 16 15:15:37 crc kubenswrapper[4728]: I1216 15:15:37.927358 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-km67h"] Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.033498 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6ee5-account-create-update-6k7r6"] Dec 16 15:15:38 crc kubenswrapper[4728]: W1216 15:15:38.046270 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2be26ac5_6df6_4245_abaa_07c0e6fcdffd.slice/crio-9861c398347e2d2089a1ed96c0912175d56df6a482911aeb5afa886230743dac WatchSource:0}: Error finding container 9861c398347e2d2089a1ed96c0912175d56df6a482911aeb5afa886230743dac: Status 404 returned error can't find the container with id 9861c398347e2d2089a1ed96c0912175d56df6a482911aeb5afa886230743dac Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.119448 4728 generic.go:334] "Generic (PLEG): container finished" podID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerID="912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75" exitCode=0 Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.119496 4728 generic.go:334] "Generic (PLEG): container finished" podID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerID="056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc" exitCode=2 Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.119506 4728 generic.go:334] "Generic (PLEG): container finished" podID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerID="8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7" exitCode=0 Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.119523 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerDied","Data":"912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.119578 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerDied","Data":"056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.119591 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerDied","Data":"8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.121902 4728 generic.go:334] "Generic (PLEG): container finished" podID="887aeac4-be08-4765-ac4c-1f2ac326d1a3" containerID="73759ef8f04f1a70c0514d308b163d73f97a9db2a7bbab324d7b4170ccd16066" exitCode=0 Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.121947 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gbntn" event={"ID":"887aeac4-be08-4765-ac4c-1f2ac326d1a3","Type":"ContainerDied","Data":"73759ef8f04f1a70c0514d308b163d73f97a9db2a7bbab324d7b4170ccd16066"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.121974 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gbntn" event={"ID":"887aeac4-be08-4765-ac4c-1f2ac326d1a3","Type":"ContainerStarted","Data":"eb2846a2e35474cc404c9bd70d04ee834d35f218d6714b84a231b8af8f0f2a55"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.124850 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" event={"ID":"2be26ac5-6df6-4245-abaa-07c0e6fcdffd","Type":"ContainerStarted","Data":"9861c398347e2d2089a1ed96c0912175d56df6a482911aeb5afa886230743dac"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.126716 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-km67h" event={"ID":"381f4a37-75d0-4da2-a183-875b9bc481aa","Type":"ContainerStarted","Data":"5f5a9d87a3391e72a4b4578e9e8a1992ea4c29ea1384a7fc382dda449a092783"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.126744 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-km67h" event={"ID":"381f4a37-75d0-4da2-a183-875b9bc481aa","Type":"ContainerStarted","Data":"2ea93755c64e305d3ac4cac5d841b6fc91af530a710c2dddd686349bde25a732"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.128456 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-swl7z" event={"ID":"c269bf0a-1104-49e8-8c99-9ce6926f55c2","Type":"ContainerStarted","Data":"5a15a49a8830f47089fe02053586129c89b224a320417de2d07ef1773e3b146f"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.128487 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-swl7z" event={"ID":"c269bf0a-1104-49e8-8c99-9ce6926f55c2","Type":"ContainerStarted","Data":"13ddd0db5dbdc491cded27b8cb7a39a22ca7a4fe8775e15d2a3efc647d0f1303"} Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.150236 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7b52-account-create-update-q2f6l"] Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.249281 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-swl7z" podStartSLOduration=2.249264158 podStartE2EDuration="2.249264158s" podCreationTimestamp="2025-12-16 15:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:38.205664024 +0000 UTC 
m=+1119.045843008" watchObservedRunningTime="2025-12-16 15:15:38.249264158 +0000 UTC m=+1119.089443142" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.256617 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-km67h" podStartSLOduration=2.256603898 podStartE2EDuration="2.256603898s" podCreationTimestamp="2025-12-16 15:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:38.248046016 +0000 UTC m=+1119.088225000" watchObservedRunningTime="2025-12-16 15:15:38.256603898 +0000 UTC m=+1119.096782882" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.340287 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2e16-account-create-update-stnh7"] Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.593096 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586bdc5f9-s7f56" podUID="c659dc6b-019b-4cc8-81c5-2a7732c684c6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.819483 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929159 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33646e3-23f5-40a1-88ef-f55bdd5a230c-logs\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929268 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-secret-key\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929350 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-combined-ca-bundle\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929379 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-scripts\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929440 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-config-data\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929460 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tz9g\" (UniqueName: \"kubernetes.io/projected/f33646e3-23f5-40a1-88ef-f55bdd5a230c-kube-api-access-4tz9g\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.929477 4728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-tls-certs\") pod \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\" (UID: \"f33646e3-23f5-40a1-88ef-f55bdd5a230c\") " Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.931238 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33646e3-23f5-40a1-88ef-f55bdd5a230c-logs" (OuterVolumeSpecName: "logs") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.936152 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33646e3-23f5-40a1-88ef-f55bdd5a230c-kube-api-access-4tz9g" (OuterVolumeSpecName: "kube-api-access-4tz9g") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "kube-api-access-4tz9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.939585 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.963059 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-config-data" (OuterVolumeSpecName: "config-data") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.971897 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-scripts" (OuterVolumeSpecName: "scripts") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:38 crc kubenswrapper[4728]: I1216 15:15:38.979256 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.000081 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f33646e3-23f5-40a1-88ef-f55bdd5a230c" (UID: "f33646e3-23f5-40a1-88ef-f55bdd5a230c"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032087 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33646e3-23f5-40a1-88ef-f55bdd5a230c-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032133 4728 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032151 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032165 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032182 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f33646e3-23f5-40a1-88ef-f55bdd5a230c-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032196 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tz9g\" (UniqueName: \"kubernetes.io/projected/f33646e3-23f5-40a1-88ef-f55bdd5a230c-kube-api-access-4tz9g\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.032210 4728 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33646e3-23f5-40a1-88ef-f55bdd5a230c-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.138214 4728 generic.go:334] "Generic (PLEG): container finished" podID="2be26ac5-6df6-4245-abaa-07c0e6fcdffd" containerID="8b33f1d15420a129b4b38864e51d2bc248a9beb033e87b07e04a0a535e65877a" exitCode=0 Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.139273 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" event={"ID":"2be26ac5-6df6-4245-abaa-07c0e6fcdffd","Type":"ContainerDied","Data":"8b33f1d15420a129b4b38864e51d2bc248a9beb033e87b07e04a0a535e65877a"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.139889 4728 generic.go:334] "Generic (PLEG): container finished" podID="c269bf0a-1104-49e8-8c99-9ce6926f55c2" containerID="5a15a49a8830f47089fe02053586129c89b224a320417de2d07ef1773e3b146f" exitCode=0 Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.139917 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-swl7z" event={"ID":"c269bf0a-1104-49e8-8c99-9ce6926f55c2","Type":"ContainerDied","Data":"5a15a49a8830f47089fe02053586129c89b224a320417de2d07ef1773e3b146f"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.142891 4728 generic.go:334] "Generic (PLEG): container finished" podID="381f4a37-75d0-4da2-a183-875b9bc481aa" containerID="5f5a9d87a3391e72a4b4578e9e8a1992ea4c29ea1384a7fc382dda449a092783" exitCode=0 Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.142942 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-km67h" 
event={"ID":"381f4a37-75d0-4da2-a183-875b9bc481aa","Type":"ContainerDied","Data":"5f5a9d87a3391e72a4b4578e9e8a1992ea4c29ea1384a7fc382dda449a092783"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.144827 4728 generic.go:334] "Generic (PLEG): container finished" podID="56aab1b9-1cbf-4647-b025-581f674334d6" containerID="a391b84ed8b4e1ddbf4cb68c6096ee3c1ea90f9c71f3f101d22e807bbe259f2c" exitCode=0 Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.144908 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" event={"ID":"56aab1b9-1cbf-4647-b025-581f674334d6","Type":"ContainerDied","Data":"a391b84ed8b4e1ddbf4cb68c6096ee3c1ea90f9c71f3f101d22e807bbe259f2c"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.145032 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" event={"ID":"56aab1b9-1cbf-4647-b025-581f674334d6","Type":"ContainerStarted","Data":"995de42443cf1285fa2d0cc3a517837f65b82516220289cb95cd69aab842232f"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.147123 4728 generic.go:334] "Generic (PLEG): container finished" podID="296e275b-fc1b-4946-a4f2-2d61fac9aff8" containerID="b0337e79c24fb11aa755a2a464f61de22b18cda479f725824638dae8ee5c82f7" exitCode=0 Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.147176 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" event={"ID":"296e275b-fc1b-4946-a4f2-2d61fac9aff8","Type":"ContainerDied","Data":"b0337e79c24fb11aa755a2a464f61de22b18cda479f725824638dae8ee5c82f7"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.147197 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" event={"ID":"296e275b-fc1b-4946-a4f2-2d61fac9aff8","Type":"ContainerStarted","Data":"55a150eb8e7bacd4939ca1bcc9002fb31ee8eab5ca223baaae6c3f9c484b3eb5"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.153826 4728 generic.go:334] "Generic (PLEG): container finished" podID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerID="ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7" exitCode=137 Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.153993 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-589dd4bc84-6zndr" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.154057 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589dd4bc84-6zndr" event={"ID":"f33646e3-23f5-40a1-88ef-f55bdd5a230c","Type":"ContainerDied","Data":"ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.154099 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-589dd4bc84-6zndr" event={"ID":"f33646e3-23f5-40a1-88ef-f55bdd5a230c","Type":"ContainerDied","Data":"f02920902c541c62d745a8a4b7de35807d1e8663e2730e585d9f0126a6e8e341"} Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.154118 4728 scope.go:117] "RemoveContainer" containerID="416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.204836 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-54bb7475-hxsvl" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.242535 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-589dd4bc84-6zndr"] Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.251328 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-589dd4bc84-6zndr"] Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.398658 4728 scope.go:117] "RemoveContainer" containerID="ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.465563 4728 scope.go:117] "RemoveContainer" containerID="416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770" Dec 16 15:15:39 crc kubenswrapper[4728]: E1216 15:15:39.466446 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770\": container with ID starting with 416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770 not found: ID does not exist" containerID="416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.466513 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770"} err="failed to get container status \"416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770\": rpc error: code = NotFound desc = could not find container \"416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770\": container with ID starting with 416c87cdfebda4039d34de70fad75d6ff9b34d1ceff5f616170da07fce9de770 not found: ID does not exist" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.466539 4728 scope.go:117] "RemoveContainer" containerID="ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7" Dec 16 15:15:39 crc kubenswrapper[4728]: E1216 15:15:39.466957 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7\": container with ID starting with ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7 not found: ID does not exist" containerID="ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.466980 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7"} err="failed to get container status \"ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7\": rpc error: code = NotFound desc = could not find container \"ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7\": container with ID starting with ddcaff3be22b8f414506d5b3ee1098549482ec35fb7337f8347cca704c4ca3f7 not found: ID does not exist" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.534469 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" path="/var/lib/kubelet/pods/f33646e3-23f5-40a1-88ef-f55bdd5a230c/volumes" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.597188 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.643844 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tclcp\" (UniqueName: \"kubernetes.io/projected/887aeac4-be08-4765-ac4c-1f2ac326d1a3-kube-api-access-tclcp\") pod \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.643903 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/887aeac4-be08-4765-ac4c-1f2ac326d1a3-operator-scripts\") pod \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\" (UID: \"887aeac4-be08-4765-ac4c-1f2ac326d1a3\") " Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.645275 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887aeac4-be08-4765-ac4c-1f2ac326d1a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "887aeac4-be08-4765-ac4c-1f2ac326d1a3" (UID: "887aeac4-be08-4765-ac4c-1f2ac326d1a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.650221 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887aeac4-be08-4765-ac4c-1f2ac326d1a3-kube-api-access-tclcp" (OuterVolumeSpecName: "kube-api-access-tclcp") pod "887aeac4-be08-4765-ac4c-1f2ac326d1a3" (UID: "887aeac4-be08-4765-ac4c-1f2ac326d1a3"). InnerVolumeSpecName "kube-api-access-tclcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.746200 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tclcp\" (UniqueName: \"kubernetes.io/projected/887aeac4-be08-4765-ac4c-1f2ac326d1a3-kube-api-access-tclcp\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:39 crc kubenswrapper[4728]: I1216 15:15:39.746233 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/887aeac4-be08-4765-ac4c-1f2ac326d1a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.161521 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gbntn" event={"ID":"887aeac4-be08-4765-ac4c-1f2ac326d1a3","Type":"ContainerDied","Data":"eb2846a2e35474cc404c9bd70d04ee834d35f218d6714b84a231b8af8f0f2a55"} Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.161561 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb2846a2e35474cc404c9bd70d04ee834d35f218d6714b84a231b8af8f0f2a55" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.161648 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gbntn" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.163815 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9d9zb" event={"ID":"f82109b1-c2b6-462c-8857-d0d8b243f64a","Type":"ContainerStarted","Data":"5e1e9b3b604f67a60129366077364666dd5a725f1f9dafc89251d3e837096e8c"} Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.185149 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9d9zb" podStartSLOduration=2.995501104 podStartE2EDuration="1m36.185129886s" podCreationTimestamp="2025-12-16 15:14:04 +0000 UTC" firstStartedPulling="2025-12-16 15:14:05.830450423 +0000 UTC m=+1026.670629417" lastFinishedPulling="2025-12-16 15:15:39.020079215 +0000 UTC m=+1119.860258199" observedRunningTime="2025-12-16 15:15:40.182570506 +0000 UTC m=+1121.022749480" watchObservedRunningTime="2025-12-16 15:15:40.185129886 +0000 UTC m=+1121.025308870" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.571460 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.705629 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrwl8\" (UniqueName: \"kubernetes.io/projected/56aab1b9-1cbf-4647-b025-581f674334d6-kube-api-access-xrwl8\") pod \"56aab1b9-1cbf-4647-b025-581f674334d6\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.705699 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aab1b9-1cbf-4647-b025-581f674334d6-operator-scripts\") pod \"56aab1b9-1cbf-4647-b025-581f674334d6\" (UID: \"56aab1b9-1cbf-4647-b025-581f674334d6\") " Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.706922 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56aab1b9-1cbf-4647-b025-581f674334d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56aab1b9-1cbf-4647-b025-581f674334d6" (UID: "56aab1b9-1cbf-4647-b025-581f674334d6"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.718137 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56aab1b9-1cbf-4647-b025-581f674334d6-kube-api-access-xrwl8" (OuterVolumeSpecName: "kube-api-access-xrwl8") pod "56aab1b9-1cbf-4647-b025-581f674334d6" (UID: "56aab1b9-1cbf-4647-b025-581f674334d6"). InnerVolumeSpecName "kube-api-access-xrwl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.808479 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrwl8\" (UniqueName: \"kubernetes.io/projected/56aab1b9-1cbf-4647-b025-581f674334d6-kube-api-access-xrwl8\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.808511 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aab1b9-1cbf-4647-b025-581f674334d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.819994 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.897485 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.912906 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9m8j\" (UniqueName: \"kubernetes.io/projected/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-kube-api-access-t9m8j\") pod \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.913118 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-operator-scripts\") pod \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\" (UID: \"2be26ac5-6df6-4245-abaa-07c0e6fcdffd\") " Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.919941 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-kube-api-access-t9m8j" (OuterVolumeSpecName: "kube-api-access-t9m8j") pod "2be26ac5-6df6-4245-abaa-07c0e6fcdffd" (UID: "2be26ac5-6df6-4245-abaa-07c0e6fcdffd"). InnerVolumeSpecName "kube-api-access-t9m8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.920044 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2be26ac5-6df6-4245-abaa-07c0e6fcdffd" (UID: "2be26ac5-6df6-4245-abaa-07c0e6fcdffd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.937808 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.958481 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:40 crc kubenswrapper[4728]: I1216 15:15:40.959134 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.016019 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddkdp\" (UniqueName: \"kubernetes.io/projected/296e275b-fc1b-4946-a4f2-2d61fac9aff8-kube-api-access-ddkdp\") pod \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.016154 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/381f4a37-75d0-4da2-a183-875b9bc481aa-operator-scripts\") pod \"381f4a37-75d0-4da2-a183-875b9bc481aa\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.016239 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgzpq\" (UniqueName: \"kubernetes.io/projected/381f4a37-75d0-4da2-a183-875b9bc481aa-kube-api-access-lgzpq\") pod \"381f4a37-75d0-4da2-a183-875b9bc481aa\" (UID: \"381f4a37-75d0-4da2-a183-875b9bc481aa\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.016334 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296e275b-fc1b-4946-a4f2-2d61fac9aff8-operator-scripts\") pod \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\" (UID: \"296e275b-fc1b-4946-a4f2-2d61fac9aff8\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.016797 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.016849 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9m8j\" (UniqueName: \"kubernetes.io/projected/2be26ac5-6df6-4245-abaa-07c0e6fcdffd-kube-api-access-t9m8j\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.017382 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296e275b-fc1b-4946-a4f2-2d61fac9aff8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "296e275b-fc1b-4946-a4f2-2d61fac9aff8" (UID: "296e275b-fc1b-4946-a4f2-2d61fac9aff8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.017680 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381f4a37-75d0-4da2-a183-875b9bc481aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "381f4a37-75d0-4da2-a183-875b9bc481aa" (UID: "381f4a37-75d0-4da2-a183-875b9bc481aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.020830 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296e275b-fc1b-4946-a4f2-2d61fac9aff8-kube-api-access-ddkdp" (OuterVolumeSpecName: "kube-api-access-ddkdp") pod "296e275b-fc1b-4946-a4f2-2d61fac9aff8" (UID: "296e275b-fc1b-4946-a4f2-2d61fac9aff8"). InnerVolumeSpecName "kube-api-access-ddkdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.023490 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381f4a37-75d0-4da2-a183-875b9bc481aa-kube-api-access-lgzpq" (OuterVolumeSpecName: "kube-api-access-lgzpq") pod "381f4a37-75d0-4da2-a183-875b9bc481aa" (UID: "381f4a37-75d0-4da2-a183-875b9bc481aa"). InnerVolumeSpecName "kube-api-access-lgzpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.117845 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-config-data\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.117894 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45vm7\" (UniqueName: \"kubernetes.io/projected/6300b826-1fb2-439e-b26f-fabdbc0aef58-kube-api-access-45vm7\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.117929 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-log-httpd\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.118625 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" (UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.118690 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" (UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.118723 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-run-httpd\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.118773 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqqw7\" (UniqueName: \"kubernetes.io/projected/c269bf0a-1104-49e8-8c99-9ce6926f55c2-kube-api-access-gqqw7\") pod \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.118803 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-combined-ca-bundle\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119080 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-scripts\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119196 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c269bf0a-1104-49e8-8c99-9ce6926f55c2-operator-scripts\") pod \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\" (UID: \"c269bf0a-1104-49e8-8c99-9ce6926f55c2\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119233 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-sg-core-conf-yaml\") pod \"6300b826-1fb2-439e-b26f-fabdbc0aef58\" (UID: \"6300b826-1fb2-439e-b26f-fabdbc0aef58\") " Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119707 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/296e275b-fc1b-4946-a4f2-2d61fac9aff8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119735 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddkdp\" (UniqueName: \"kubernetes.io/projected/296e275b-fc1b-4946-a4f2-2d61fac9aff8-kube-api-access-ddkdp\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119747 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/381f4a37-75d0-4da2-a183-875b9bc481aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119760 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.119772 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6300b826-1fb2-439e-b26f-fabdbc0aef58-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc 
kubenswrapper[4728]: I1216 15:15:41.119783 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgzpq\" (UniqueName: \"kubernetes.io/projected/381f4a37-75d0-4da2-a183-875b9bc481aa-kube-api-access-lgzpq\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.120513 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c269bf0a-1104-49e8-8c99-9ce6926f55c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c269bf0a-1104-49e8-8c99-9ce6926f55c2" (UID: "c269bf0a-1104-49e8-8c99-9ce6926f55c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.122241 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c269bf0a-1104-49e8-8c99-9ce6926f55c2-kube-api-access-gqqw7" (OuterVolumeSpecName: "kube-api-access-gqqw7") pod "c269bf0a-1104-49e8-8c99-9ce6926f55c2" (UID: "c269bf0a-1104-49e8-8c99-9ce6926f55c2"). InnerVolumeSpecName "kube-api-access-gqqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.122883 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-scripts" (OuterVolumeSpecName: "scripts") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" (UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.124605 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6300b826-1fb2-439e-b26f-fabdbc0aef58-kube-api-access-45vm7" (OuterVolumeSpecName: "kube-api-access-45vm7") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" (UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "kube-api-access-45vm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.147630 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" (UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.177210 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" event={"ID":"56aab1b9-1cbf-4647-b025-581f674334d6","Type":"ContainerDied","Data":"995de42443cf1285fa2d0cc3a517837f65b82516220289cb95cd69aab842232f"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.178280 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995de42443cf1285fa2d0cc3a517837f65b82516220289cb95cd69aab842232f" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.177239 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7b52-account-create-update-q2f6l" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.179579 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" event={"ID":"296e275b-fc1b-4946-a4f2-2d61fac9aff8","Type":"ContainerDied","Data":"55a150eb8e7bacd4939ca1bcc9002fb31ee8eab5ca223baaae6c3f9c484b3eb5"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.179611 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55a150eb8e7bacd4939ca1bcc9002fb31ee8eab5ca223baaae6c3f9c484b3eb5" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.181751 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.182037 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6ee5-account-create-update-6k7r6" event={"ID":"2be26ac5-6df6-4245-abaa-07c0e6fcdffd","Type":"ContainerDied","Data":"9861c398347e2d2089a1ed96c0912175d56df6a482911aeb5afa886230743dac"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.182360 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9861c398347e2d2089a1ed96c0912175d56df6a482911aeb5afa886230743dac" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.182587 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2e16-account-create-update-stnh7" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.185110 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-km67h" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.185145 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-km67h" event={"ID":"381f4a37-75d0-4da2-a183-875b9bc481aa","Type":"ContainerDied","Data":"2ea93755c64e305d3ac4cac5d841b6fc91af530a710c2dddd686349bde25a732"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.185176 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea93755c64e305d3ac4cac5d841b6fc91af530a710c2dddd686349bde25a732" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.187491 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" (UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.188187 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-swl7z" event={"ID":"c269bf0a-1104-49e8-8c99-9ce6926f55c2","Type":"ContainerDied","Data":"13ddd0db5dbdc491cded27b8cb7a39a22ca7a4fe8775e15d2a3efc647d0f1303"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.188249 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13ddd0db5dbdc491cded27b8cb7a39a22ca7a4fe8775e15d2a3efc647d0f1303" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.188295 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-swl7z" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.197539 4728 generic.go:334] "Generic (PLEG): container finished" podID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerID="9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc" exitCode=0 Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.197599 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerDied","Data":"9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.197633 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6300b826-1fb2-439e-b26f-fabdbc0aef58","Type":"ContainerDied","Data":"71b34b4ce1e26772c4d432b67af6701c4f3d7c5196df452a7c0a13586fe861e4"} Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.197652 4728 scope.go:117] "RemoveContainer" containerID="912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.197879 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221376 4728 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c269bf0a-1104-49e8-8c99-9ce6926f55c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221425 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221439 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45vm7\" (UniqueName: \"kubernetes.io/projected/6300b826-1fb2-439e-b26f-fabdbc0aef58-kube-api-access-45vm7\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221452 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqqw7\" (UniqueName: \"kubernetes.io/projected/c269bf0a-1104-49e8-8c99-9ce6926f55c2-kube-api-access-gqqw7\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221462 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221473 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.221847 4728 scope.go:117] "RemoveContainer" containerID="056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.241330 4728 scope.go:117] "RemoveContainer" containerID="9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.250539 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-config-data" (OuterVolumeSpecName: "config-data") pod "6300b826-1fb2-439e-b26f-fabdbc0aef58" 
(UID: "6300b826-1fb2-439e-b26f-fabdbc0aef58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.268191 4728 scope.go:117] "RemoveContainer" containerID="8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.290357 4728 scope.go:117] "RemoveContainer" containerID="912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.290805 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75\": container with ID starting with 912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75 not found: ID does not exist" containerID="912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.290852 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75"} err="failed to get container status \"912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75\": rpc error: code = NotFound desc = could not find container \"912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75\": container with ID starting with 912b0a189a3dffd665e062cfbf339fa92d973a3865852be62d9a9ba329b75b75 not found: ID does not exist" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.290909 4728 scope.go:117] "RemoveContainer" containerID="056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.291418 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc\": container with ID starting with 056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc not found: ID does not exist" containerID="056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.291442 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc"} err="failed to get container status \"056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc\": rpc error: code = NotFound desc = could not find container \"056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc\": container with ID starting with 056e959cbec29a22efd892db54f05c2e6a7d980567a54d7916bb33821cc4e4cc not found: ID does not exist" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.291457 4728 scope.go:117] "RemoveContainer" containerID="9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.291870 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc\": container with ID starting with 9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc not found: ID does not exist" containerID="9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.291889 4728 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc"} err="failed to get container status \"9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc\": rpc error: code = NotFound desc = could not find container \"9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc\": container with ID starting with 9712bd358a8e03427b8942231b3df866f07af86aaa38d9efa9a08f577d7248dc not found: ID does not exist" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.291900 4728 scope.go:117] "RemoveContainer" containerID="8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.292270 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7\": container with ID starting with 8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7 not found: ID does not exist" containerID="8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.292291 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7"} err="failed to get container status \"8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7\": rpc error: code = NotFound desc = could not find container \"8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7\": container with ID starting with 8db2da414d25d2c977a873298f4f1fc6ddf9c272b6b6de773d0d0d8221bc83f7 not found: ID does not exist" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.323732 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6300b826-1fb2-439e-b26f-fabdbc0aef58-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.614746 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.632020 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647208 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647758 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887aeac4-be08-4765-ac4c-1f2ac326d1a3" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647787 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="887aeac4-be08-4765-ac4c-1f2ac326d1a3" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647797 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c269bf0a-1104-49e8-8c99-9ce6926f55c2" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647806 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c269bf0a-1104-49e8-8c99-9ce6926f55c2" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647822 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be26ac5-6df6-4245-abaa-07c0e6fcdffd" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647831 4728 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2be26ac5-6df6-4245-abaa-07c0e6fcdffd" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647848 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon-log" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647857 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon-log" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647872 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56aab1b9-1cbf-4647-b025-581f674334d6" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647882 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56aab1b9-1cbf-4647-b025-581f674334d6" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647903 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-notification-agent" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647912 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-notification-agent" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647931 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-central-agent" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647942 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-central-agent" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647964 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="sg-core" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.647975 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="sg-core" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.647992 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="proxy-httpd" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648002 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="proxy-httpd" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.648014 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381f4a37-75d0-4da2-a183-875b9bc481aa" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648022 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="381f4a37-75d0-4da2-a183-875b9bc481aa" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.648035 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648043 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" Dec 16 15:15:41 crc kubenswrapper[4728]: E1216 15:15:41.648062 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296e275b-fc1b-4946-a4f2-2d61fac9aff8" containerName="mariadb-account-create-update" Dec 16 
15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648072 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="296e275b-fc1b-4946-a4f2-2d61fac9aff8" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648334 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon-log" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648359 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c269bf0a-1104-49e8-8c99-9ce6926f55c2" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648376 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="887aeac4-be08-4765-ac4c-1f2ac326d1a3" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648389 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="proxy-httpd" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648398 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-notification-agent" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648458 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be26ac5-6df6-4245-abaa-07c0e6fcdffd" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648477 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="sg-core" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648492 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="296e275b-fc1b-4946-a4f2-2d61fac9aff8" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648509 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" containerName="ceilometer-central-agent" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648519 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="381f4a37-75d0-4da2-a183-875b9bc481aa" containerName="mariadb-database-create" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648534 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33646e3-23f5-40a1-88ef-f55bdd5a230c" containerName="horizon" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.648550 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56aab1b9-1cbf-4647-b025-581f674334d6" containerName="mariadb-account-create-update" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.650835 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.654065 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.684366 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.685395 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730098 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-config-data\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730164 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730228 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-run-httpd\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730286 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730609 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-log-httpd\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730674 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-kube-api-access-nmv9q\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.730744 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-scripts\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833215 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-config-data\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833335 
4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833440 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-run-httpd\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833536 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833786 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-log-httpd\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833851 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-kube-api-access-nmv9q\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.833924 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-scripts\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.835035 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-run-httpd\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.835065 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-log-httpd\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.838115 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-config-data\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.838823 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.838945 4728 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.840773 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-scripts\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.880160 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-kube-api-access-nmv9q\") pod \"ceilometer-0\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " pod="openstack/ceilometer-0" Dec 16 15:15:41 crc kubenswrapper[4728]: I1216 15:15:41.994091 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:15:42 crc kubenswrapper[4728]: I1216 15:15:42.459582 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:43 crc kubenswrapper[4728]: I1216 15:15:43.216138 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerStarted","Data":"246bcbf1c1a0a4e3509e8170b0f78e99d7e9888cc2843f2e4e0c942d7b021dc0"} Dec 16 15:15:43 crc kubenswrapper[4728]: I1216 15:15:43.525979 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6300b826-1fb2-439e-b26f-fabdbc0aef58" path="/var/lib/kubelet/pods/6300b826-1fb2-439e-b26f-fabdbc0aef58/volumes" Dec 16 15:15:45 crc kubenswrapper[4728]: I1216 15:15:45.240724 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerStarted","Data":"196597e6768245547f66f20e4f2e75c81d874be15b2ddc830f3b1e68902864f4"} Dec 16 15:15:45 crc kubenswrapper[4728]: I1216 15:15:45.242866 4728 generic.go:334] "Generic (PLEG): container finished" podID="f82109b1-c2b6-462c-8857-d0d8b243f64a" containerID="5e1e9b3b604f67a60129366077364666dd5a725f1f9dafc89251d3e837096e8c" exitCode=0 Dec 16 15:15:45 crc kubenswrapper[4728]: I1216 15:15:45.242920 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9d9zb" event={"ID":"f82109b1-c2b6-462c-8857-d0d8b243f64a","Type":"ContainerDied","Data":"5e1e9b3b604f67a60129366077364666dd5a725f1f9dafc89251d3e837096e8c"} Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.258861 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerStarted","Data":"39cc123c9ab5b1c3549fd536272a73a942013a815832bee0783d90955fec8d40"} Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.482765 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.681252 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.728064 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-config-data\") pod \"f82109b1-c2b6-462c-8857-d0d8b243f64a\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.728216 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82109b1-c2b6-462c-8857-d0d8b243f64a-etc-machine-id\") pod \"f82109b1-c2b6-462c-8857-d0d8b243f64a\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.728249 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-db-sync-config-data\") pod \"f82109b1-c2b6-462c-8857-d0d8b243f64a\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.728267 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-scripts\") pod \"f82109b1-c2b6-462c-8857-d0d8b243f64a\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.728354 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-combined-ca-bundle\") pod \"f82109b1-c2b6-462c-8857-d0d8b243f64a\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.728388 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mzdz\" (UniqueName: \"kubernetes.io/projected/f82109b1-c2b6-462c-8857-d0d8b243f64a-kube-api-access-5mzdz\") pod \"f82109b1-c2b6-462c-8857-d0d8b243f64a\" (UID: \"f82109b1-c2b6-462c-8857-d0d8b243f64a\") " Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.729706 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f82109b1-c2b6-462c-8857-d0d8b243f64a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f82109b1-c2b6-462c-8857-d0d8b243f64a" (UID: "f82109b1-c2b6-462c-8857-d0d8b243f64a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.748175 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82109b1-c2b6-462c-8857-d0d8b243f64a-kube-api-access-5mzdz" (OuterVolumeSpecName: "kube-api-access-5mzdz") pod "f82109b1-c2b6-462c-8857-d0d8b243f64a" (UID: "f82109b1-c2b6-462c-8857-d0d8b243f64a"). InnerVolumeSpecName "kube-api-access-5mzdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.753847 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f82109b1-c2b6-462c-8857-d0d8b243f64a" (UID: "f82109b1-c2b6-462c-8857-d0d8b243f64a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.755098 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f82109b1-c2b6-462c-8857-d0d8b243f64a" (UID: "f82109b1-c2b6-462c-8857-d0d8b243f64a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.766643 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-scripts" (OuterVolumeSpecName: "scripts") pod "f82109b1-c2b6-462c-8857-d0d8b243f64a" (UID: "f82109b1-c2b6-462c-8857-d0d8b243f64a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.809833 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-config-data" (OuterVolumeSpecName: "config-data") pod "f82109b1-c2b6-462c-8857-d0d8b243f64a" (UID: "f82109b1-c2b6-462c-8857-d0d8b243f64a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.832753 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.832799 4728 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f82109b1-c2b6-462c-8857-d0d8b243f64a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.832823 4728 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.832834 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.832847 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82109b1-c2b6-462c-8857-d0d8b243f64a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:46 crc kubenswrapper[4728]: I1216 15:15:46.832865 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mzdz\" (UniqueName: \"kubernetes.io/projected/f82109b1-c2b6-462c-8857-d0d8b243f64a-kube-api-access-5mzdz\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.268492 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9d9zb" event={"ID":"f82109b1-c2b6-462c-8857-d0d8b243f64a","Type":"ContainerDied","Data":"602f0ca44a3ff7f8c6da99b8662d66dc3eede1e66157a97b462bc306b9102e68"} Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.268539 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602f0ca44a3ff7f8c6da99b8662d66dc3eede1e66157a97b462bc306b9102e68" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.268619 4728 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9d9zb" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.299083 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bh899"] Dec 16 15:15:47 crc kubenswrapper[4728]: E1216 15:15:47.299645 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" containerName="cinder-db-sync" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.299671 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" containerName="cinder-db-sync" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.299957 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" containerName="cinder-db-sync" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.300822 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.302765 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zmrxn" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.302994 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.303846 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.321741 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bh899"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.387890 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.398915 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7dcd7544cd-gnxgg" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.442687 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-scripts\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.442807 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-config-data\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.442868 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvfwd\" (UniqueName: \"kubernetes.io/projected/3d185e68-d66c-438b-b4c2-bde356e4313e-kube-api-access-hvfwd\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.442922 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.544053 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.545655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-scripts\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.545747 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-config-data\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.545820 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvfwd\" (UniqueName: \"kubernetes.io/projected/3d185e68-d66c-438b-b4c2-bde356e4313e-kube-api-access-hvfwd\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.545871 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.546062 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.551815 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-config-data\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.552397 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.552688 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.552820 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-k7j25" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.552971 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.564095 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.575723 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-scripts\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.580192 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvfwd\" (UniqueName: \"kubernetes.io/projected/3d185e68-d66c-438b-b4c2-bde356e4313e-kube-api-access-hvfwd\") pod \"nova-cell0-conductor-db-sync-bh899\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.580367 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.627770 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jx95k"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.630775 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.631348 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.652129 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.652185 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.652224 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-scripts\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.652290 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.652311 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.652417 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swv6h\" (UniqueName: \"kubernetes.io/projected/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-kube-api-access-swv6h\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.668661 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jx95k"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756293 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756331 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756372 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756399 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-config\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756487 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756517 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4z6q\" (UniqueName: \"kubernetes.io/projected/eae8087e-b189-4db5-b646-d421f63a4828-kube-api-access-w4z6q\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756545 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swv6h\" (UniqueName: \"kubernetes.io/projected/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-kube-api-access-swv6h\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756804 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756822 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756839 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756879 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.756911 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-scripts\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.757348 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.760883 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.766828 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-scripts\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.769807 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.771431 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.773190 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.773290 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.779335 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.784957 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swv6h\" (UniqueName: \"kubernetes.io/projected/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-kube-api-access-swv6h\") pod \"cinder-scheduler-0\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.800802 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.820239 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.858864 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.858902 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-config\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.858939 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4z6q\" (UniqueName: \"kubernetes.io/projected/eae8087e-b189-4db5-b646-d421f63a4828-kube-api-access-w4z6q\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.858970 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-scripts\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.858999 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwvn4\" (UniqueName: \"kubernetes.io/projected/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-kube-api-access-vwvn4\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859018 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859056 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859094 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-logs\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859111 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859133 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859169 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data-custom\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859225 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.859264 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.860782 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.860868 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.861683 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.861832 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.861939 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-config\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.880083 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4z6q\" (UniqueName: 
\"kubernetes.io/projected/eae8087e-b189-4db5-b646-d421f63a4828-kube-api-access-w4z6q\") pod \"dnsmasq-dns-5c9776ccc5-jx95k\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.960869 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.961071 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-scripts\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.961113 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwvn4\" (UniqueName: \"kubernetes.io/projected/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-kube-api-access-vwvn4\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.961150 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.961187 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-logs\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.961226 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.961306 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data-custom\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.962669 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.963057 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-logs\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.967945 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data-custom\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.968642 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.969124 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.971178 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-scripts\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:47 crc kubenswrapper[4728]: I1216 15:15:47.977854 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwvn4\" (UniqueName: \"kubernetes.io/projected/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-kube-api-access-vwvn4\") pod \"cinder-api-0\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " pod="openstack/cinder-api-0" Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.139895 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.143840 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.233269 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bh899"] Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.292688 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerStarted","Data":"55bb5fa9c2353c08711c33c17e5a2614662a29bd01c2f32e4189126d76d69fb4"} Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.427300 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:15:48 crc kubenswrapper[4728]: W1216 15:15:48.843548 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeae8087e_b189_4db5_b646_d421f63a4828.slice/crio-72d541b05fca4293c3199a0c11d3048f323320a3ad5dfb55c21e7120e14f8a3b WatchSource:0}: Error finding container 72d541b05fca4293c3199a0c11d3048f323320a3ad5dfb55c21e7120e14f8a3b: Status 404 returned error can't find the container with id 72d541b05fca4293c3199a0c11d3048f323320a3ad5dfb55c21e7120e14f8a3b Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.845448 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jx95k"] Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.860990 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.883865 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79c9d99cd5-967vg" Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.962916 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bb7b6f474-4bf42"] Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.963170 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bb7b6f474-4bf42" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-api" containerID="cri-o://5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df" gracePeriod=30 Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.963314 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bb7b6f474-4bf42" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-httpd" containerID="cri-o://480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9" gracePeriod=30 Dec 16 15:15:48 crc kubenswrapper[4728]: I1216 15:15:48.979203 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:15:49 crc kubenswrapper[4728]: I1216 15:15:49.305215 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bh899" event={"ID":"3d185e68-d66c-438b-b4c2-bde356e4313e","Type":"ContainerStarted","Data":"ba46741d82bedf9dc4a421ce74f7877b01c58d1441c39c9af9eadf0c1b33c299"} Dec 16 15:15:49 crc kubenswrapper[4728]: I1216 15:15:49.312278 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6","Type":"ContainerStarted","Data":"24adae403493fe5eae64b5a5c3b9c892e6d21cc3ed63a776af990401db24f794"} Dec 16 15:15:49 crc kubenswrapper[4728]: I1216 15:15:49.319505 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"877e164c-b0b1-4a39-bd9c-1f2432ed8e36","Type":"ContainerStarted","Data":"0f23b2dea167ea7ec9b6cd2b0729fddd6d53e57c86205dc08a1e8047865b244b"} Dec 16 15:15:49 crc kubenswrapper[4728]: I1216 15:15:49.320601 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" event={"ID":"eae8087e-b189-4db5-b646-d421f63a4828","Type":"ContainerStarted","Data":"72d541b05fca4293c3199a0c11d3048f323320a3ad5dfb55c21e7120e14f8a3b"} Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.006458 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.007064 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-log" containerID="cri-o://54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c" gracePeriod=30 Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.008498 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-httpd" containerID="cri-o://71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0" gracePeriod=30 Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.341125 4728 generic.go:334] "Generic (PLEG): container finished" podID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerID="54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c" exitCode=143 Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.341461 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"955f80b9-933a-4583-92b7-f11c5ccd1bec","Type":"ContainerDied","Data":"54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c"} Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.342767 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6","Type":"ContainerStarted","Data":"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2"} Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.346271 4728 generic.go:334] "Generic (PLEG): container finished" podID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerID="480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9" exitCode=0 Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.346334 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bb7b6f474-4bf42" event={"ID":"8afc45c2-e8b6-4886-aa09-87ff2f284587","Type":"ContainerDied","Data":"480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9"} Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.348735 4728 generic.go:334] "Generic (PLEG): container finished" podID="eae8087e-b189-4db5-b646-d421f63a4828" containerID="36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29" exitCode=0 Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.348771 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" event={"ID":"eae8087e-b189-4db5-b646-d421f63a4828","Type":"ContainerDied","Data":"36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29"} Dec 16 15:15:50 crc kubenswrapper[4728]: I1216 15:15:50.829602 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 
15:15:51.258191 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.258721 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-log" containerID="cri-o://c2f27f86e8a584c665cd7d6538155dc1abeea1e78c5d29a69d32d58406d404cd" gracePeriod=30 Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.259833 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-httpd" containerID="cri-o://238866146b238760e52aa044cfbec6bb7112b3a6ebc78826658a15ac386aba0a" gracePeriod=30 Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.365596 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6","Type":"ContainerStarted","Data":"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2"} Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.366647 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.369330 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerStarted","Data":"65be674c5d4ece6b7da770926f2956f0f42c3bcb6bdee62bf3d217479d36fe55"} Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.369595 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.369606 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-central-agent" containerID="cri-o://196597e6768245547f66f20e4f2e75c81d874be15b2ddc830f3b1e68902864f4" gracePeriod=30 Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.369629 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="proxy-httpd" containerID="cri-o://65be674c5d4ece6b7da770926f2956f0f42c3bcb6bdee62bf3d217479d36fe55" gracePeriod=30 Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.369731 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-notification-agent" containerID="cri-o://39cc123c9ab5b1c3549fd536272a73a942013a815832bee0783d90955fec8d40" gracePeriod=30 Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.369785 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="sg-core" containerID="cri-o://55bb5fa9c2353c08711c33c17e5a2614662a29bd01c2f32e4189126d76d69fb4" gracePeriod=30 Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.376935 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" event={"ID":"eae8087e-b189-4db5-b646-d421f63a4828","Type":"ContainerStarted","Data":"1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56"} Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.377638 4728 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.398001 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.397976587 podStartE2EDuration="4.397976587s" podCreationTimestamp="2025-12-16 15:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:51.387854051 +0000 UTC m=+1132.228033035" watchObservedRunningTime="2025-12-16 15:15:51.397976587 +0000 UTC m=+1132.238155571" Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.415460 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7557577970000002 podStartE2EDuration="10.415444391s" podCreationTimestamp="2025-12-16 15:15:41 +0000 UTC" firstStartedPulling="2025-12-16 15:15:42.465800883 +0000 UTC m=+1123.305979867" lastFinishedPulling="2025-12-16 15:15:50.125487487 +0000 UTC m=+1130.965666461" observedRunningTime="2025-12-16 15:15:51.413859388 +0000 UTC m=+1132.254038372" watchObservedRunningTime="2025-12-16 15:15:51.415444391 +0000 UTC m=+1132.255623365" Dec 16 15:15:51 crc kubenswrapper[4728]: I1216 15:15:51.443078 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" podStartSLOduration=4.443061152 podStartE2EDuration="4.443061152s" podCreationTimestamp="2025-12-16 15:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:15:51.435458635 +0000 UTC m=+1132.275637629" watchObservedRunningTime="2025-12-16 15:15:51.443061152 +0000 UTC m=+1132.283240136" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.344267 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.407446 4728 generic.go:334] "Generic (PLEG): container finished" podID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerID="5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df" exitCode=0 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.407512 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bb7b6f474-4bf42" event={"ID":"8afc45c2-e8b6-4886-aa09-87ff2f284587","Type":"ContainerDied","Data":"5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.407540 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bb7b6f474-4bf42" event={"ID":"8afc45c2-e8b6-4886-aa09-87ff2f284587","Type":"ContainerDied","Data":"60706bcc752a964d0b44edd92fa8b274b4fa219e87a05030ac3b15c2907c4636"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.407543 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bb7b6f474-4bf42" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.407557 4728 scope.go:117] "RemoveContainer" containerID="480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.412368 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"877e164c-b0b1-4a39-bd9c-1f2432ed8e36","Type":"ContainerStarted","Data":"7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.412439 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"877e164c-b0b1-4a39-bd9c-1f2432ed8e36","Type":"ContainerStarted","Data":"89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.418033 4728 generic.go:334] "Generic (PLEG): container finished" podID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerID="65be674c5d4ece6b7da770926f2956f0f42c3bcb6bdee62bf3d217479d36fe55" exitCode=0 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.418059 4728 generic.go:334] "Generic (PLEG): container finished" podID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerID="55bb5fa9c2353c08711c33c17e5a2614662a29bd01c2f32e4189126d76d69fb4" exitCode=2 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.418068 4728 generic.go:334] "Generic (PLEG): container finished" podID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerID="39cc123c9ab5b1c3549fd536272a73a942013a815832bee0783d90955fec8d40" exitCode=0 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.418136 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerDied","Data":"65be674c5d4ece6b7da770926f2956f0f42c3bcb6bdee62bf3d217479d36fe55"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.418195 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerDied","Data":"55bb5fa9c2353c08711c33c17e5a2614662a29bd01c2f32e4189126d76d69fb4"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.418207 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerDied","Data":"39cc123c9ab5b1c3549fd536272a73a942013a815832bee0783d90955fec8d40"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.425128 4728 generic.go:334] "Generic (PLEG): container finished" podID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerID="c2f27f86e8a584c665cd7d6538155dc1abeea1e78c5d29a69d32d58406d404cd" exitCode=143 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.425234 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"916a6b2e-6b7b-457e-b2a2-80d02edc2217","Type":"ContainerDied","Data":"c2f27f86e8a584c665cd7d6538155dc1abeea1e78c5d29a69d32d58406d404cd"} Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.425320 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api-log" containerID="cri-o://fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2" gracePeriod=30 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.425444 4728 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/cinder-api-0" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api" containerID="cri-o://2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2" gracePeriod=30 Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.430559 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.896519575 podStartE2EDuration="5.430542427s" podCreationTimestamp="2025-12-16 15:15:47 +0000 UTC" firstStartedPulling="2025-12-16 15:15:48.456359468 +0000 UTC m=+1129.296538452" lastFinishedPulling="2025-12-16 15:15:50.99038232 +0000 UTC m=+1131.830561304" observedRunningTime="2025-12-16 15:15:52.428327107 +0000 UTC m=+1133.268506081" watchObservedRunningTime="2025-12-16 15:15:52.430542427 +0000 UTC m=+1133.270721411" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.450749 4728 scope.go:117] "RemoveContainer" containerID="5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.491989 4728 scope.go:117] "RemoveContainer" containerID="480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9" Dec 16 15:15:52 crc kubenswrapper[4728]: E1216 15:15:52.492453 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9\": container with ID starting with 480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9 not found: ID does not exist" containerID="480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.492501 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9"} err="failed to get container status \"480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9\": rpc error: code = NotFound desc = could not find container \"480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9\": container with ID starting with 480702e744fda23af83816df0e11583025f961140b4254d7d9b81ad996c99db9 not found: ID does not exist" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.492530 4728 scope.go:117] "RemoveContainer" containerID="5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df" Dec 16 15:15:52 crc kubenswrapper[4728]: E1216 15:15:52.495635 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df\": container with ID starting with 5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df not found: ID does not exist" containerID="5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.495668 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df"} err="failed to get container status \"5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df\": rpc error: code = NotFound desc = could not find container \"5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df\": container with ID starting with 5f7f618cd0f44c6b955bc7b6ab006be78656f4479e74966cf974e74066d1f7df not found: ID does not exist" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.506027 4728 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-combined-ca-bundle\") pod \"8afc45c2-e8b6-4886-aa09-87ff2f284587\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.506153 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82tlw\" (UniqueName: \"kubernetes.io/projected/8afc45c2-e8b6-4886-aa09-87ff2f284587-kube-api-access-82tlw\") pod \"8afc45c2-e8b6-4886-aa09-87ff2f284587\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.506191 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-config\") pod \"8afc45c2-e8b6-4886-aa09-87ff2f284587\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.506346 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-ovndb-tls-certs\") pod \"8afc45c2-e8b6-4886-aa09-87ff2f284587\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.506425 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-httpd-config\") pod \"8afc45c2-e8b6-4886-aa09-87ff2f284587\" (UID: \"8afc45c2-e8b6-4886-aa09-87ff2f284587\") " Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.512602 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8afc45c2-e8b6-4886-aa09-87ff2f284587-kube-api-access-82tlw" (OuterVolumeSpecName: "kube-api-access-82tlw") pod "8afc45c2-e8b6-4886-aa09-87ff2f284587" (UID: "8afc45c2-e8b6-4886-aa09-87ff2f284587"). InnerVolumeSpecName "kube-api-access-82tlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.512691 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8afc45c2-e8b6-4886-aa09-87ff2f284587" (UID: "8afc45c2-e8b6-4886-aa09-87ff2f284587"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.593016 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-config" (OuterVolumeSpecName: "config") pod "8afc45c2-e8b6-4886-aa09-87ff2f284587" (UID: "8afc45c2-e8b6-4886-aa09-87ff2f284587"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.604079 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8afc45c2-e8b6-4886-aa09-87ff2f284587" (UID: "8afc45c2-e8b6-4886-aa09-87ff2f284587"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.604442 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8afc45c2-e8b6-4886-aa09-87ff2f284587" (UID: "8afc45c2-e8b6-4886-aa09-87ff2f284587"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.613397 4728 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.613510 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.613529 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.613540 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82tlw\" (UniqueName: \"kubernetes.io/projected/8afc45c2-e8b6-4886-aa09-87ff2f284587-kube-api-access-82tlw\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.613550 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8afc45c2-e8b6-4886-aa09-87ff2f284587-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.818180 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bb7b6f474-4bf42"] Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.822595 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 15:15:52 crc kubenswrapper[4728]: I1216 15:15:52.826782 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bb7b6f474-4bf42"] Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.091361 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.223871 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-etc-machine-id\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.224007 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-combined-ca-bundle\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.224095 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.224138 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data-custom\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.224166 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-logs\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.224187 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwvn4\" (UniqueName: \"kubernetes.io/projected/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-kube-api-access-vwvn4\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.224272 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-scripts\") pod \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\" (UID: \"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6\") " Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.225900 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.225969 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-logs" (OuterVolumeSpecName: "logs") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.230902 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-scripts" (OuterVolumeSpecName: "scripts") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.230997 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.237205 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-kube-api-access-vwvn4" (OuterVolumeSpecName: "kube-api-access-vwvn4") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "kube-api-access-vwvn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.269171 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.273498 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data" (OuterVolumeSpecName: "config-data") pod "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" (UID: "758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327631 4728 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327670 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327683 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327694 4728 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327705 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327715 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwvn4\" (UniqueName: \"kubernetes.io/projected/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-kube-api-access-vwvn4\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.327727 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440200 4728 generic.go:334] "Generic (PLEG): container finished" podID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerID="2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2" exitCode=0 Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440664 4728 generic.go:334] "Generic (PLEG): container finished" podID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerID="fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2" exitCode=143 Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440376 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6","Type":"ContainerDied","Data":"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2"} Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440745 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6","Type":"ContainerDied","Data":"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2"} Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440766 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6","Type":"ContainerDied","Data":"24adae403493fe5eae64b5a5c3b9c892e6d21cc3ed63a776af990401db24f794"} Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440790 4728 scope.go:117] "RemoveContainer" containerID="2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.440456 
4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.475356 4728 scope.go:117] "RemoveContainer" containerID="fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.481002 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.501904 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.521599 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" path="/var/lib/kubelet/pods/758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6/volumes" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.522160 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" path="/var/lib/kubelet/pods/8afc45c2-e8b6-4886-aa09-87ff2f284587/volumes" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.522968 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:53 crc kubenswrapper[4728]: E1216 15:15:53.523263 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-api" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523278 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-api" Dec 16 15:15:53 crc kubenswrapper[4728]: E1216 15:15:53.523295 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523303 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api" Dec 16 15:15:53 crc kubenswrapper[4728]: E1216 15:15:53.523322 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api-log" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523330 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api-log" Dec 16 15:15:53 crc kubenswrapper[4728]: E1216 15:15:53.523357 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-httpd" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523364 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-httpd" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523666 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-httpd" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523680 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8afc45c2-e8b6-4886-aa09-87ff2f284587" containerName="neutron-api" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523691 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.523702 4728 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="758e3f1b-f4f5-4bb9-bbeb-aea09fc24ee6" containerName="cinder-api-log" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.525047 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.529385 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.536148 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": dial tcp 10.217.0.155:9292: connect: connection refused" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.536755 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.536976 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.537175 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": dial tcp 10.217.0.155:9292: connect: connection refused" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.537325 4728 scope.go:117] "RemoveContainer" containerID="2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2" Dec 16 15:15:53 crc kubenswrapper[4728]: E1216 15:15:53.540390 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2\": container with ID starting with 2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2 not found: ID does not exist" containerID="2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.540446 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2"} err="failed to get container status \"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2\": rpc error: code = NotFound desc = could not find container \"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2\": container with ID starting with 2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2 not found: ID does not exist" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.540476 4728 scope.go:117] "RemoveContainer" containerID="fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2" Dec 16 15:15:53 crc kubenswrapper[4728]: E1216 15:15:53.540850 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2\": container with ID starting with fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2 not found: ID does not exist" containerID="fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.540870 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2"} err="failed to get container status \"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2\": rpc error: code = NotFound desc = could not find container \"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2\": container with ID starting with fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2 not found: ID does not exist" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.540886 4728 scope.go:117] "RemoveContainer" containerID="2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.541239 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2"} err="failed to get container status \"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2\": rpc error: code = NotFound desc = could not find container \"2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2\": container with ID starting with 2fbbfba83266d64204dd5e28c3074e0d23d9e157e4ec7282f3e2e5f3a0b6e3c2 not found: ID does not exist" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.541273 4728 scope.go:117] "RemoveContainer" containerID="fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.542014 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2"} err="failed to get container status \"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2\": rpc error: code = NotFound desc = could not find container \"fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2\": container with ID starting with fa167ab3860af231cc50b1498a287d499d48d5af155167e13a673b39efca62b2 not found: ID does not exist" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.554180 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.636830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-scripts\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.636919 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637009 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75q47\" (UniqueName: \"kubernetes.io/projected/c3da261d-5106-45a2-a6c7-d5314450c0af-kube-api-access-75q47\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637044 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637074 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-config-data\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3da261d-5106-45a2-a6c7-d5314450c0af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637149 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3da261d-5106-45a2-a6c7-d5314450c0af-logs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637208 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.637346 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-config-data-custom\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741370 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-config-data\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741421 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3da261d-5106-45a2-a6c7-d5314450c0af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741438 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3da261d-5106-45a2-a6c7-d5314450c0af-logs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741472 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741534 4728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-config-data-custom\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741556 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-scripts\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741584 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741621 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75q47\" (UniqueName: \"kubernetes.io/projected/c3da261d-5106-45a2-a6c7-d5314450c0af-kube-api-access-75q47\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.741643 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.742832 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3da261d-5106-45a2-a6c7-d5314450c0af-logs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.742888 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3da261d-5106-45a2-a6c7-d5314450c0af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.749831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-scripts\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.750012 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-config-data-custom\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.750451 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.750599 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-config-data\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.750935 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.752125 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3da261d-5106-45a2-a6c7-d5314450c0af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.768787 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75q47\" (UniqueName: \"kubernetes.io/projected/c3da261d-5106-45a2-a6c7-d5314450c0af-kube-api-access-75q47\") pod \"cinder-api-0\" (UID: \"c3da261d-5106-45a2-a6c7-d5314450c0af\") " pod="openstack/cinder-api-0" Dec 16 15:15:53 crc kubenswrapper[4728]: I1216 15:15:53.913599 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.019873 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149095 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-public-tls-certs\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149139 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f2l9\" (UniqueName: \"kubernetes.io/projected/955f80b9-933a-4583-92b7-f11c5ccd1bec-kube-api-access-7f2l9\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149213 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-combined-ca-bundle\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149309 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-config-data\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149355 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149385 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-scripts\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149470 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-httpd-run\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.149528 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-logs\") pod \"955f80b9-933a-4583-92b7-f11c5ccd1bec\" (UID: \"955f80b9-933a-4583-92b7-f11c5ccd1bec\") " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.150950 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-logs" (OuterVolumeSpecName: "logs") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.151315 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.185172 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955f80b9-933a-4583-92b7-f11c5ccd1bec-kube-api-access-7f2l9" (OuterVolumeSpecName: "kube-api-access-7f2l9") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "kube-api-access-7f2l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.186752 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-scripts" (OuterVolumeSpecName: "scripts") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.186826 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.191496 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.216032 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-config-data" (OuterVolumeSpecName: "config-data") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.233126 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "955f80b9-933a-4583-92b7-f11c5ccd1bec" (UID: "955f80b9-933a-4583-92b7-f11c5ccd1bec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251771 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251819 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251855 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251868 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251880 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251891 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/955f80b9-933a-4583-92b7-f11c5ccd1bec-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251901 4728 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/955f80b9-933a-4583-92b7-f11c5ccd1bec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.251914 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f2l9\" (UniqueName: \"kubernetes.io/projected/955f80b9-933a-4583-92b7-f11c5ccd1bec-kube-api-access-7f2l9\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.273514 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.353557 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.457642 
4728 generic.go:334] "Generic (PLEG): container finished" podID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerID="71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0" exitCode=0 Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.457737 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"955f80b9-933a-4583-92b7-f11c5ccd1bec","Type":"ContainerDied","Data":"71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0"} Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.457761 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.457771 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"955f80b9-933a-4583-92b7-f11c5ccd1bec","Type":"ContainerDied","Data":"94d2d6edd402cf7b05d9f3b0d4ccab70bf6318c6fde6dba9d27cd00b55e0f405"} Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.457785 4728 scope.go:117] "RemoveContainer" containerID="71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.489440 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 16 15:15:54 crc kubenswrapper[4728]: W1216 15:15:54.515668 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3da261d_5106_45a2_a6c7_d5314450c0af.slice/crio-90e2b97b61c9bfb044e45ce01ccf50eba38189e166f3324301441efeae0c2fae WatchSource:0}: Error finding container 90e2b97b61c9bfb044e45ce01ccf50eba38189e166f3324301441efeae0c2fae: Status 404 returned error can't find the container with id 90e2b97b61c9bfb044e45ce01ccf50eba38189e166f3324301441efeae0c2fae Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.519153 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.534076 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.535163 4728 scope.go:117] "RemoveContainer" containerID="54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.543805 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:54 crc kubenswrapper[4728]: E1216 15:15:54.544201 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-log" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.544217 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-log" Dec 16 15:15:54 crc kubenswrapper[4728]: E1216 15:15:54.544249 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-httpd" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.544257 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-httpd" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.544457 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-httpd" Dec 16 15:15:54 crc 
kubenswrapper[4728]: I1216 15:15:54.544477 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" containerName="glance-log" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.545608 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.550822 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.550991 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.552987 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.615971 4728 scope.go:117] "RemoveContainer" containerID="71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0" Dec 16 15:15:54 crc kubenswrapper[4728]: E1216 15:15:54.617388 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0\": container with ID starting with 71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0 not found: ID does not exist" containerID="71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.617450 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0"} err="failed to get container status \"71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0\": rpc error: code = NotFound desc = could not find container \"71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0\": container with ID starting with 71456e3c15bb8b6d3c388276434bb7ea4c4d604f85b0d84ce7a5c18f65aa3bc0 not found: ID does not exist" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.617480 4728 scope.go:117] "RemoveContainer" containerID="54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c" Dec 16 15:15:54 crc kubenswrapper[4728]: E1216 15:15:54.618028 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c\": container with ID starting with 54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c not found: ID does not exist" containerID="54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.618073 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c"} err="failed to get container status \"54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c\": rpc error: code = NotFound desc = could not find container \"54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c\": container with ID starting with 54de5f9900bfcdc23e1e7a56e6c98d73d888febe274e8746006c1b971865501c not found: ID does not exist" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.658846 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-scripts\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.658894 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-config-data\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.658931 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.658964 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/453173c9-63a1-457e-bf01-dd45f194a238-logs\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.659020 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.659075 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdlj\" (UniqueName: \"kubernetes.io/projected/453173c9-63a1-457e-bf01-dd45f194a238-kube-api-access-qvdlj\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.659099 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/453173c9-63a1-457e-bf01-dd45f194a238-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.659436 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767132 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/453173c9-63a1-457e-bf01-dd45f194a238-logs\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767438 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767493 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdlj\" (UniqueName: \"kubernetes.io/projected/453173c9-63a1-457e-bf01-dd45f194a238-kube-api-access-qvdlj\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767517 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/453173c9-63a1-457e-bf01-dd45f194a238-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767540 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767579 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/453173c9-63a1-457e-bf01-dd45f194a238-logs\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-scripts\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767624 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-config-data\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.767660 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.768278 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.768580 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/453173c9-63a1-457e-bf01-dd45f194a238-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.774798 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.777717 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-config-data\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.778112 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.786039 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdlj\" (UniqueName: \"kubernetes.io/projected/453173c9-63a1-457e-bf01-dd45f194a238-kube-api-access-qvdlj\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.797177 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/453173c9-63a1-457e-bf01-dd45f194a238-scripts\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.797653 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"453173c9-63a1-457e-bf01-dd45f194a238\") " pod="openstack/glance-default-external-api-0" Dec 16 15:15:54 crc kubenswrapper[4728]: I1216 15:15:54.997471 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 16 15:15:55 crc kubenswrapper[4728]: I1216 15:15:55.468169 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3da261d-5106-45a2-a6c7-d5314450c0af","Type":"ContainerStarted","Data":"9a38e3a8ec413b7f4ac40db1c089e0d96b44abe5ac073c8659f210658dcf8e14"} Dec 16 15:15:55 crc kubenswrapper[4728]: I1216 15:15:55.468500 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3da261d-5106-45a2-a6c7-d5314450c0af","Type":"ContainerStarted","Data":"90e2b97b61c9bfb044e45ce01ccf50eba38189e166f3324301441efeae0c2fae"} Dec 16 15:15:55 crc kubenswrapper[4728]: I1216 15:15:55.470955 4728 generic.go:334] "Generic (PLEG): container finished" podID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerID="238866146b238760e52aa044cfbec6bb7112b3a6ebc78826658a15ac386aba0a" exitCode=0 Dec 16 15:15:55 crc kubenswrapper[4728]: I1216 15:15:55.471002 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"916a6b2e-6b7b-457e-b2a2-80d02edc2217","Type":"ContainerDied","Data":"238866146b238760e52aa044cfbec6bb7112b3a6ebc78826658a15ac386aba0a"} Dec 16 15:15:55 crc kubenswrapper[4728]: I1216 15:15:55.517459 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="955f80b9-933a-4583-92b7-f11c5ccd1bec" path="/var/lib/kubelet/pods/955f80b9-933a-4583-92b7-f11c5ccd1bec/volumes" Dec 16 15:15:56 crc kubenswrapper[4728]: I1216 15:15:56.485833 4728 generic.go:334] "Generic (PLEG): container finished" podID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerID="196597e6768245547f66f20e4f2e75c81d874be15b2ddc830f3b1e68902864f4" exitCode=0 Dec 16 15:15:56 crc kubenswrapper[4728]: I1216 15:15:56.486109 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerDied","Data":"196597e6768245547f66f20e4f2e75c81d874be15b2ddc830f3b1e68902864f4"} Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.033906 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.075312 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.142299 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.191901 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tv7cs"] Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.192155 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerName="dnsmasq-dns" containerID="cri-o://6fc491f73133d11fbaa46dd0217df12f4d855567ea1a4a94d59719d5897f8318" gracePeriod=10 Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.511959 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="cinder-scheduler" containerID="cri-o://89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b" gracePeriod=30 Dec 16 15:15:58 crc kubenswrapper[4728]: I1216 15:15:58.512600 4728 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/cinder-scheduler-0" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="probe" containerID="cri-o://7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372" gracePeriod=30 Dec 16 15:15:59 crc kubenswrapper[4728]: I1216 15:15:59.549643 4728 generic.go:334] "Generic (PLEG): container finished" podID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerID="6fc491f73133d11fbaa46dd0217df12f4d855567ea1a4a94d59719d5897f8318" exitCode=0 Dec 16 15:15:59 crc kubenswrapper[4728]: I1216 15:15:59.552754 4728 generic.go:334] "Generic (PLEG): container finished" podID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerID="7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372" exitCode=0 Dec 16 15:15:59 crc kubenswrapper[4728]: I1216 15:15:59.559118 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" event={"ID":"6964d7cb-c1bb-4296-ad17-d56280a0e8f0","Type":"ContainerDied","Data":"6fc491f73133d11fbaa46dd0217df12f4d855567ea1a4a94d59719d5897f8318"} Dec 16 15:15:59 crc kubenswrapper[4728]: I1216 15:15:59.559167 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"877e164c-b0b1-4a39-bd9c-1f2432ed8e36","Type":"ContainerDied","Data":"7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372"} Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.467017 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.582055 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"916a6b2e-6b7b-457e-b2a2-80d02edc2217","Type":"ContainerDied","Data":"a82f5c2564c9cc5f92f2e539291846ae73ba3fc8979705dbe5d7044490e83272"} Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.582105 4728 scope.go:117] "RemoveContainer" containerID="238866146b238760e52aa044cfbec6bb7112b3a6ebc78826658a15ac386aba0a" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.582237 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.595994 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-internal-tls-certs\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596133 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-httpd-run\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596165 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596240 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp6xc\" (UniqueName: \"kubernetes.io/projected/916a6b2e-6b7b-457e-b2a2-80d02edc2217-kube-api-access-vp6xc\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596310 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-config-data\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596331 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-scripts\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596397 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-combined-ca-bundle\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596533 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-logs\") pod \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\" (UID: \"916a6b2e-6b7b-457e-b2a2-80d02edc2217\") " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.596714 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.597433 4728 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.599508 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-logs" (OuterVolumeSpecName: "logs") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.603560 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-scripts" (OuterVolumeSpecName: "scripts") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.604050 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916a6b2e-6b7b-457e-b2a2-80d02edc2217-kube-api-access-vp6xc" (OuterVolumeSpecName: "kube-api-access-vp6xc") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "kube-api-access-vp6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.617876 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.657874 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.686475 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.699050 4728 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.699114 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.699128 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp6xc\" (UniqueName: \"kubernetes.io/projected/916a6b2e-6b7b-457e-b2a2-80d02edc2217-kube-api-access-vp6xc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.699140 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.706607 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-config-data" (OuterVolumeSpecName: "config-data") pod "916a6b2e-6b7b-457e-b2a2-80d02edc2217" (UID: "916a6b2e-6b7b-457e-b2a2-80d02edc2217"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.712621 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.712915 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916a6b2e-6b7b-457e-b2a2-80d02edc2217-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.721270 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.814774 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.814801 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916a6b2e-6b7b-457e-b2a2-80d02edc2217-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.916842 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.924699 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.937033 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:00 crc kubenswrapper[4728]: E1216 15:16:00.937801 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-log" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.937825 4728 
state_mem.go:107] "Deleted CPUSet assignment" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-log" Dec 16 15:16:00 crc kubenswrapper[4728]: E1216 15:16:00.937848 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-httpd" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.937856 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-httpd" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.938069 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-log" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.938096 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-httpd" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.939322 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.947280 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.947610 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.966468 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.985955 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:00 crc kubenswrapper[4728]: I1216 15:16:00.989745 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.019953 4728 scope.go:117] "RemoveContainer" containerID="c2f27f86e8a584c665cd7d6538155dc1abeea1e78c5d29a69d32d58406d404cd" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.120117 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-combined-ca-bundle\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.120497 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-svc\") pod \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.120538 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-scripts\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.120559 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-nb\") pod \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121401 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-kube-api-access-nmv9q\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-config-data\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121547 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-config\") pod \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121606 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-swift-storage-0\") pod \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121630 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-run-httpd\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121728 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7ps8t\" (UniqueName: \"kubernetes.io/projected/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-kube-api-access-7ps8t\") pod \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121755 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-sg-core-conf-yaml\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121789 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-log-httpd\") pod \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\" (UID: \"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121808 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-sb\") pod \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\" (UID: \"6964d7cb-c1bb-4296-ad17-d56280a0e8f0\") " Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.121993 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7q4\" (UniqueName: \"kubernetes.io/projected/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-kube-api-access-hh7q4\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122023 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-config-data\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122053 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-logs\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122078 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-scripts\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122111 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122183 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") 
pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122205 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.122233 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.125107 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-scripts" (OuterVolumeSpecName: "scripts") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.125739 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.126054 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.133900 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-kube-api-access-nmv9q" (OuterVolumeSpecName: "kube-api-access-nmv9q") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "kube-api-access-nmv9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.144006 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-kube-api-access-7ps8t" (OuterVolumeSpecName: "kube-api-access-7ps8t") pod "6964d7cb-c1bb-4296-ad17-d56280a0e8f0" (UID: "6964d7cb-c1bb-4296-ad17-d56280a0e8f0"). InnerVolumeSpecName "kube-api-access-7ps8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.167637 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.197057 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6964d7cb-c1bb-4296-ad17-d56280a0e8f0" (UID: "6964d7cb-c1bb-4296-ad17-d56280a0e8f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.208330 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6964d7cb-c1bb-4296-ad17-d56280a0e8f0" (UID: "6964d7cb-c1bb-4296-ad17-d56280a0e8f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.213758 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6964d7cb-c1bb-4296-ad17-d56280a0e8f0" (UID: "6964d7cb-c1bb-4296-ad17-d56280a0e8f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224410 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224478 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224511 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224545 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7q4\" (UniqueName: \"kubernetes.io/projected/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-kube-api-access-hh7q4\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224573 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-config-data\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224599 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224621 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-scripts\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224696 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224723 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224736 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224745 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224754 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmv9q\" (UniqueName: \"kubernetes.io/projected/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-kube-api-access-nmv9q\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224766 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224774 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224782 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ps8t\" (UniqueName: \"kubernetes.io/projected/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-kube-api-access-7ps8t\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224790 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.224797 4728 reconciler_common.go:293] "Volume 
detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.225046 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.225688 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-logs\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.231542 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-scripts\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.232294 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-config-data\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.238453 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.240853 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.245549 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7q4\" (UniqueName: \"kubernetes.io/projected/db21c1bc-6a08-4948-8cea-5d5ee3ecd223-kube-api-access-hh7q4\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.258380 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-config" (OuterVolumeSpecName: "config") pod "6964d7cb-c1bb-4296-ad17-d56280a0e8f0" (UID: "6964d7cb-c1bb-4296-ad17-d56280a0e8f0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.271034 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6964d7cb-c1bb-4296-ad17-d56280a0e8f0" (UID: "6964d7cb-c1bb-4296-ad17-d56280a0e8f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.273409 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"db21c1bc-6a08-4948-8cea-5d5ee3ecd223\") " pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.292857 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.324124 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-config-data" (OuterVolumeSpecName: "config-data") pod "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" (UID: "dc177dd3-fc63-4d9b-a4fc-07e58a49bafb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.326102 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.326136 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.326147 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6964d7cb-c1bb-4296-ad17-d56280a0e8f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.326156 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.523879 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" path="/var/lib/kubelet/pods/916a6b2e-6b7b-457e-b2a2-80d02edc2217/volumes" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.526453 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.571536 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.623377 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bh899" event={"ID":"3d185e68-d66c-438b-b4c2-bde356e4313e","Type":"ContainerStarted","Data":"a765450ffe387eb14fc11c7120b09befd31a586d0ce0af163b0ce45c794e0318"} Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.631770 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" event={"ID":"6964d7cb-c1bb-4296-ad17-d56280a0e8f0","Type":"ContainerDied","Data":"870a6d7e4e4dbe748a9091d99d14f68eacbda0ce459e2241af0044f0d5f4a6e3"} Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.631817 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tv7cs" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.631829 4728 scope.go:117] "RemoveContainer" containerID="6fc491f73133d11fbaa46dd0217df12f4d855567ea1a4a94d59719d5897f8318" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.632863 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"453173c9-63a1-457e-bf01-dd45f194a238","Type":"ContainerStarted","Data":"9bed24b7b50f0e5b79df3291a5364be810787898ef43a2732a27dc506c57184d"} Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.637203 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc177dd3-fc63-4d9b-a4fc-07e58a49bafb","Type":"ContainerDied","Data":"246bcbf1c1a0a4e3509e8170b0f78e99d7e9888cc2843f2e4e0c942d7b021dc0"} Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.637666 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.641961 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bh899" podStartSLOduration=1.8189626620000001 podStartE2EDuration="14.641947738s" podCreationTimestamp="2025-12-16 15:15:47 +0000 UTC" firstStartedPulling="2025-12-16 15:15:48.27577056 +0000 UTC m=+1129.115949544" lastFinishedPulling="2025-12-16 15:16:01.098755636 +0000 UTC m=+1141.938934620" observedRunningTime="2025-12-16 15:16:01.641075903 +0000 UTC m=+1142.481254887" watchObservedRunningTime="2025-12-16 15:16:01.641947738 +0000 UTC m=+1142.482126722" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.707649 4728 scope.go:117] "RemoveContainer" containerID="bec4747c7d00dee4e3413fdb56fb8612b07fcff3c1529bc844705155109166a3" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.718756 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.734969 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.764100 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tv7cs"] Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.765181 4728 scope.go:117] "RemoveContainer" containerID="65be674c5d4ece6b7da770926f2956f0f42c3bcb6bdee62bf3d217479d36fe55" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.776494 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tv7cs"] Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.790920 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:01 crc kubenswrapper[4728]: E1216 15:16:01.791499 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-notification-agent" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791522 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-notification-agent" Dec 16 15:16:01 crc kubenswrapper[4728]: E1216 15:16:01.791546 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerName="dnsmasq-dns" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791555 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerName="dnsmasq-dns" Dec 16 15:16:01 crc kubenswrapper[4728]: E1216 15:16:01.791567 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="proxy-httpd" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791575 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="proxy-httpd" Dec 16 15:16:01 crc kubenswrapper[4728]: E1216 15:16:01.791590 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="sg-core" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791597 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="sg-core" Dec 16 15:16:01 crc kubenswrapper[4728]: E1216 15:16:01.791609 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-central-agent" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791617 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-central-agent" Dec 16 15:16:01 crc kubenswrapper[4728]: E1216 15:16:01.791629 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerName="init" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791636 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerName="init" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791892 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-central-agent" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791912 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="sg-core" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791921 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="proxy-httpd" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791940 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" containerName="dnsmasq-dns" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.791956 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" containerName="ceilometer-notification-agent" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.794512 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.798862 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.799284 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.806655 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.822560 4728 scope.go:117] "RemoveContainer" containerID="55bb5fa9c2353c08711c33c17e5a2614662a29bd01c2f32e4189126d76d69fb4" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.848053 4728 scope.go:117] "RemoveContainer" containerID="39cc123c9ab5b1c3549fd536272a73a942013a815832bee0783d90955fec8d40" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.874970 4728 scope.go:117] "RemoveContainer" containerID="196597e6768245547f66f20e4f2e75c81d874be15b2ddc830f3b1e68902864f4" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938078 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938201 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-run-httpd\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938237 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938288 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-log-httpd\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938349 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg7qr\" (UniqueName: \"kubernetes.io/projected/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-kube-api-access-zg7qr\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938437 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-scripts\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:01 crc kubenswrapper[4728]: I1216 15:16:01.938485 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-config-data\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040541 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-config-data\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040602 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040655 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-run-httpd\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040671 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040693 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-log-httpd\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040748 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg7qr\" (UniqueName: \"kubernetes.io/projected/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-kube-api-access-zg7qr\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.040775 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-scripts\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.042023 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-run-httpd\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.042349 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-log-httpd\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.046310 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-scripts\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.046846 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.047152 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.047780 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-config-data\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.063384 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg7qr\" (UniqueName: \"kubernetes.io/projected/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-kube-api-access-zg7qr\") pod \"ceilometer-0\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.119483 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.197591 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.588181 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.656186 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"453173c9-63a1-457e-bf01-dd45f194a238","Type":"ContainerStarted","Data":"4754be2f50eead9a45667aae6671a4504d5dd8bf665880e7821bc85d05e3c71e"} Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.662166 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"db21c1bc-6a08-4948-8cea-5d5ee3ecd223","Type":"ContainerStarted","Data":"b03af912fa1b8cb31be623622beac0c382c1d00cd98bdce5f41f05deb82a4928"} Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.664267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c3da261d-5106-45a2-a6c7-d5314450c0af","Type":"ContainerStarted","Data":"138008b80e29b5235831682e54a7410473cd166ab64c55c095e8b50c3f05640c"} Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.665187 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.667543 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerStarted","Data":"b55d083a4c572d17a5b136a33f52c7afd51555672d2a3232d985248f599577b4"} Dec 16 15:16:02 crc kubenswrapper[4728]: I1216 15:16:02.694667 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.694650984 podStartE2EDuration="9.694650984s" podCreationTimestamp="2025-12-16 15:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:02.682748311 +0000 UTC m=+1143.522927295" watchObservedRunningTime="2025-12-16 15:16:02.694650984 +0000 UTC m=+1143.534829968" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.233475 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.362864 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data\") pod \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.363474 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swv6h\" (UniqueName: \"kubernetes.io/projected/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-kube-api-access-swv6h\") pod \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.363611 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-combined-ca-bundle\") pod \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.363689 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-scripts\") pod \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.363747 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-etc-machine-id\") pod \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.363773 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data-custom\") pod \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\" (UID: \"877e164c-b0b1-4a39-bd9c-1f2432ed8e36\") " Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.363795 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "877e164c-b0b1-4a39-bd9c-1f2432ed8e36" (UID: "877e164c-b0b1-4a39-bd9c-1f2432ed8e36"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.364251 4728 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.367081 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "877e164c-b0b1-4a39-bd9c-1f2432ed8e36" (UID: "877e164c-b0b1-4a39-bd9c-1f2432ed8e36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.367544 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-scripts" (OuterVolumeSpecName: "scripts") pod "877e164c-b0b1-4a39-bd9c-1f2432ed8e36" (UID: "877e164c-b0b1-4a39-bd9c-1f2432ed8e36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.368186 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-kube-api-access-swv6h" (OuterVolumeSpecName: "kube-api-access-swv6h") pod "877e164c-b0b1-4a39-bd9c-1f2432ed8e36" (UID: "877e164c-b0b1-4a39-bd9c-1f2432ed8e36"). InnerVolumeSpecName "kube-api-access-swv6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.423005 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877e164c-b0b1-4a39-bd9c-1f2432ed8e36" (UID: "877e164c-b0b1-4a39-bd9c-1f2432ed8e36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.466011 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swv6h\" (UniqueName: \"kubernetes.io/projected/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-kube-api-access-swv6h\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.466046 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.466057 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.466067 4728 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.482616 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data" (OuterVolumeSpecName: "config-data") pod "877e164c-b0b1-4a39-bd9c-1f2432ed8e36" (UID: "877e164c-b0b1-4a39-bd9c-1f2432ed8e36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.517422 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6964d7cb-c1bb-4296-ad17-d56280a0e8f0" path="/var/lib/kubelet/pods/6964d7cb-c1bb-4296-ad17-d56280a0e8f0/volumes" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.518222 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc177dd3-fc63-4d9b-a4fc-07e58a49bafb" path="/var/lib/kubelet/pods/dc177dd3-fc63-4d9b-a4fc-07e58a49bafb/volumes" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.567782 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877e164c-b0b1-4a39-bd9c-1f2432ed8e36-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.690492 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"453173c9-63a1-457e-bf01-dd45f194a238","Type":"ContainerStarted","Data":"eab82f64030dd7754e57f5a2ebd9eb740542feeb2cd0433b0ae3c294cdeb5e61"} Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.693582 4728 generic.go:334] "Generic (PLEG): container finished" podID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerID="89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b" exitCode=0 Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.693760 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.693776 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"877e164c-b0b1-4a39-bd9c-1f2432ed8e36","Type":"ContainerDied","Data":"89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b"} Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.693833 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"877e164c-b0b1-4a39-bd9c-1f2432ed8e36","Type":"ContainerDied","Data":"0f23b2dea167ea7ec9b6cd2b0729fddd6d53e57c86205dc08a1e8047865b244b"} Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.693857 4728 scope.go:117] "RemoveContainer" containerID="7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.697884 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"db21c1bc-6a08-4948-8cea-5d5ee3ecd223","Type":"ContainerStarted","Data":"5ac32164773cc9445ad46cd54a773c922b69441b33aae23abe79c597b27e1fff"} Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.720330 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.720308487 podStartE2EDuration="9.720308487s" podCreationTimestamp="2025-12-16 15:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:03.718422026 +0000 UTC m=+1144.558601020" watchObservedRunningTime="2025-12-16 15:16:03.720308487 +0000 UTC m=+1144.560487471" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.725909 4728 scope.go:117] "RemoveContainer" containerID="89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.741560 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.762651 4728 scope.go:117] "RemoveContainer" containerID="7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372" Dec 16 15:16:03 crc kubenswrapper[4728]: E1216 15:16:03.763311 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372\": container with ID starting with 7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372 not found: ID does not exist" containerID="7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.763364 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372"} err="failed to get container status \"7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372\": rpc error: code = NotFound desc = could not find container \"7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372\": container with ID starting with 7c518ffb3b14e679783b896f096cf2ce285a3fd654eb9aa3bd9951f138762372 not found: ID does not exist" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.763403 4728 scope.go:117] "RemoveContainer" containerID="89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b" Dec 16 15:16:03 crc kubenswrapper[4728]: E1216 15:16:03.763742 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b\": container with ID starting with 89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b not found: ID does not exist" containerID="89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.763779 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b"} err="failed to get container status \"89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b\": rpc error: code = NotFound desc = could not find container \"89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b\": container with ID starting with 89e4afbfa8414a0838924204ee4b70fa314c4bab229dae6855d05fb62668bd1b not found: ID does not exist" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.772812 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.779655 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:03 crc kubenswrapper[4728]: E1216 15:16:03.780092 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="cinder-scheduler" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.780109 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="cinder-scheduler" Dec 16 15:16:03 crc kubenswrapper[4728]: E1216 15:16:03.780122 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="probe" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.780129 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="probe" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.780318 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="cinder-scheduler" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.780335 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" containerName="probe" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.781277 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.786903 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.787649 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.873159 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jl6t\" (UniqueName: \"kubernetes.io/projected/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-kube-api-access-4jl6t\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.873219 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-scripts\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.873255 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-config-data\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.873280 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.873338 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.873446 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975127 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jl6t\" (UniqueName: 
\"kubernetes.io/projected/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-kube-api-access-4jl6t\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975180 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-scripts\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975205 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-config-data\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975225 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975248 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975308 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.975351 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.979332 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-scripts\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.980126 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.980249 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-config-data\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.980651 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:03 crc kubenswrapper[4728]: I1216 15:16:03.990163 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jl6t\" (UniqueName: \"kubernetes.io/projected/a95d0c5b-fcce-46ba-bfae-1b25bf1d10af-kube-api-access-4jl6t\") pod \"cinder-scheduler-0\" (UID: \"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af\") " pod="openstack/cinder-scheduler-0" Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.104171 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.632834 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.725220 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af","Type":"ContainerStarted","Data":"792433e959edcd0e2265021a10f5c82d54277089c9d57048409cc878c7ad749f"} Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.727570 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"db21c1bc-6a08-4948-8cea-5d5ee3ecd223","Type":"ContainerStarted","Data":"3ff2368788da1a487dcd2acef18689d4abb6567e602aabda3649f76e0b005651"} Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.733381 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerStarted","Data":"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f"} Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.758120 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.758102029 podStartE2EDuration="4.758102029s" podCreationTimestamp="2025-12-16 15:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:04.75042888 +0000 UTC m=+1145.590607864" watchObservedRunningTime="2025-12-16 15:16:04.758102029 +0000 UTC m=+1145.598281013" Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.998656 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 15:16:04 crc kubenswrapper[4728]: I1216 15:16:04.998737 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.038968 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.051440 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.519055 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877e164c-b0b1-4a39-bd9c-1f2432ed8e36" path="/var/lib/kubelet/pods/877e164c-b0b1-4a39-bd9c-1f2432ed8e36/volumes" Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.750553 4728 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerStarted","Data":"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803"} Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.753053 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerStarted","Data":"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b"} Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.760809 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af","Type":"ContainerStarted","Data":"bd355e100dfd802842ec67926d2a7067a6991b85408b3b273ae47ce9e9ab23b6"} Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.761945 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:16:05 crc kubenswrapper[4728]: I1216 15:16:05.762129 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 16 15:16:06 crc kubenswrapper[4728]: I1216 15:16:06.772849 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a95d0c5b-fcce-46ba-bfae-1b25bf1d10af","Type":"ContainerStarted","Data":"3dccae3369c64afcbaa7db3ed0ebc84fbdd194aeedda532aa16409176171e26e"} Dec 16 15:16:06 crc kubenswrapper[4728]: I1216 15:16:06.790232 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7902160719999998 podStartE2EDuration="3.790216072s" podCreationTimestamp="2025-12-16 15:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:06.788807084 +0000 UTC m=+1147.628986068" watchObservedRunningTime="2025-12-16 15:16:06.790216072 +0000 UTC m=+1147.630395056" Dec 16 15:16:09 crc kubenswrapper[4728]: I1216 15:16:09.104626 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 16 15:16:09 crc kubenswrapper[4728]: I1216 15:16:09.799666 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerStarted","Data":"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926"} Dec 16 15:16:09 crc kubenswrapper[4728]: I1216 15:16:09.800187 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:16:09 crc kubenswrapper[4728]: I1216 15:16:09.826358 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.602218588 podStartE2EDuration="8.826337089s" podCreationTimestamp="2025-12-16 15:16:01 +0000 UTC" firstStartedPulling="2025-12-16 15:16:02.598244715 +0000 UTC m=+1143.438423699" lastFinishedPulling="2025-12-16 15:16:08.822363206 +0000 UTC m=+1149.662542200" observedRunningTime="2025-12-16 15:16:09.817312374 +0000 UTC m=+1150.657491378" watchObservedRunningTime="2025-12-16 15:16:09.826337089 +0000 UTC m=+1150.666516073" Dec 16 15:16:10 crc kubenswrapper[4728]: I1216 15:16:10.098341 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 15:16:10 crc kubenswrapper[4728]: I1216 15:16:10.100695 4728 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 16 15:16:10 crc kubenswrapper[4728]: I1216 15:16:10.791507 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.007106 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.572145 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.573299 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.604552 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.628476 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.814722 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-central-agent" containerID="cri-o://66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" gracePeriod=30 Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.815239 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-notification-agent" containerID="cri-o://7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" gracePeriod=30 Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.815311 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="proxy-httpd" containerID="cri-o://9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" gracePeriod=30 Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.815361 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="sg-core" containerID="cri-o://fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" gracePeriod=30 Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.816212 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:11 crc kubenswrapper[4728]: I1216 15:16:11.816239 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.669565 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.827687 4728 generic.go:334] "Generic (PLEG): container finished" podID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" exitCode=0 Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828427 4728 generic.go:334] "Generic (PLEG): container finished" podID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" exitCode=2 Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828527 4728 generic.go:334] "Generic (PLEG): container finished" podID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" exitCode=0 Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828605 4728 generic.go:334] "Generic (PLEG): container finished" podID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" exitCode=0 Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828383 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerDied","Data":"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926"} Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerDied","Data":"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803"} Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828865 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerDied","Data":"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b"} Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828879 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerDied","Data":"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f"} Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828893 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cbd94c25-cabc-42f7-9f8a-fb3c070622d3","Type":"ContainerDied","Data":"b55d083a4c572d17a5b136a33f52c7afd51555672d2a3232d985248f599577b4"} Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.828913 4728 scope.go:117] "RemoveContainer" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.838721 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg7qr\" (UniqueName: \"kubernetes.io/projected/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-kube-api-access-zg7qr\") pod \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.839724 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-sg-core-conf-yaml\") pod 
\"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.839852 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-scripts\") pod \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.839968 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-log-httpd\") pod \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.840129 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-config-data\") pod \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.840295 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-run-httpd\") pod \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.840440 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-combined-ca-bundle\") pod \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\" (UID: \"cbd94c25-cabc-42f7-9f8a-fb3c070622d3\") " Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.840481 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.840609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.841039 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.841106 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.844304 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-kube-api-access-zg7qr" (OuterVolumeSpecName: "kube-api-access-zg7qr") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). 
InnerVolumeSpecName "kube-api-access-zg7qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.845906 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-scripts" (OuterVolumeSpecName: "scripts") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.856398 4728 scope.go:117] "RemoveContainer" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.870096 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.884189 4728 scope.go:117] "RemoveContainer" containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.904934 4728 scope.go:117] "RemoveContainer" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.924879 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.930140 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-config-data" (OuterVolumeSpecName: "config-data") pod "cbd94c25-cabc-42f7-9f8a-fb3c070622d3" (UID: "cbd94c25-cabc-42f7-9f8a-fb3c070622d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.934014 4728 scope.go:117] "RemoveContainer" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" Dec 16 15:16:12 crc kubenswrapper[4728]: E1216 15:16:12.934534 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": container with ID starting with 9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926 not found: ID does not exist" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.934582 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926"} err="failed to get container status \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": rpc error: code = NotFound desc = could not find container \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": container with ID starting with 9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.934611 4728 scope.go:117] "RemoveContainer" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" Dec 16 15:16:12 crc kubenswrapper[4728]: E1216 15:16:12.934858 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": container with ID starting with fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803 not found: ID does not exist" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.934878 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803"} err="failed to get container status \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": rpc error: code = NotFound desc = could not find container \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": container with ID starting with fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.934891 4728 scope.go:117] "RemoveContainer" containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" Dec 16 15:16:12 crc kubenswrapper[4728]: E1216 15:16:12.935160 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": container with ID starting with 7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b not found: ID does not exist" containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.935212 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b"} err="failed to get container status \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": rpc error: code = NotFound desc = could not 
find container \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": container with ID starting with 7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.935226 4728 scope.go:117] "RemoveContainer" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" Dec 16 15:16:12 crc kubenswrapper[4728]: E1216 15:16:12.935528 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": container with ID starting with 66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f not found: ID does not exist" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.935565 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f"} err="failed to get container status \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": rpc error: code = NotFound desc = could not find container \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": container with ID starting with 66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.935596 4728 scope.go:117] "RemoveContainer" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.951782 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926"} err="failed to get container status \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": rpc error: code = NotFound desc = could not find container \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": container with ID starting with 9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.951827 4728 scope.go:117] "RemoveContainer" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.953083 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.953110 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg7qr\" (UniqueName: \"kubernetes.io/projected/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-kube-api-access-zg7qr\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.953122 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.953134 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.953146 4728 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd94c25-cabc-42f7-9f8a-fb3c070622d3-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.956852 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803"} err="failed to get container status \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": rpc error: code = NotFound desc = could not find container \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": container with ID starting with fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.956964 4728 scope.go:117] "RemoveContainer" containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.957463 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b"} err="failed to get container status \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": rpc error: code = NotFound desc = could not find container \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": container with ID starting with 7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.957494 4728 scope.go:117] "RemoveContainer" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.957758 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f"} err="failed to get container status \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": rpc error: code = NotFound desc = could not find container \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": container with ID starting with 66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.957784 4728 scope.go:117] "RemoveContainer" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.958285 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926"} err="failed to get container status \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": rpc error: code = NotFound desc = could not find container \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": container with ID starting with 9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.958311 4728 scope.go:117] "RemoveContainer" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.958620 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803"} err="failed to get container status 
\"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": rpc error: code = NotFound desc = could not find container \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": container with ID starting with fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.958644 4728 scope.go:117] "RemoveContainer" containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.959757 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b"} err="failed to get container status \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": rpc error: code = NotFound desc = could not find container \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": container with ID starting with 7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.959859 4728 scope.go:117] "RemoveContainer" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.960240 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f"} err="failed to get container status \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": rpc error: code = NotFound desc = could not find container \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": container with ID starting with 66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.960287 4728 scope.go:117] "RemoveContainer" containerID="9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.960662 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926"} err="failed to get container status \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": rpc error: code = NotFound desc = could not find container \"9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926\": container with ID starting with 9cf548acb1bc62fa3bc03a19418fb84371d034678ce3de90cbaafec39a6b8926 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.960687 4728 scope.go:117] "RemoveContainer" containerID="fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.960950 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803"} err="failed to get container status \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": rpc error: code = NotFound desc = could not find container \"fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803\": container with ID starting with fb5459b5578b30a63af620ea13ddcfa0053cced9506a3dbcf1c5377a1cf70803 not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.960979 4728 scope.go:117] "RemoveContainer" 
containerID="7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.961266 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b"} err="failed to get container status \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": rpc error: code = NotFound desc = could not find container \"7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b\": container with ID starting with 7aa9a53df90fed057c593695c0ba8cd2228ce53a24bb1bd1309969ce1672e49b not found: ID does not exist" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.961291 4728 scope.go:117] "RemoveContainer" containerID="66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f" Dec 16 15:16:12 crc kubenswrapper[4728]: I1216 15:16:12.961583 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f"} err="failed to get container status \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": rpc error: code = NotFound desc = could not find container \"66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f\": container with ID starting with 66588665e2ac37d04962789604736d9f2d7322fc72ca6272092bf96f6b92064f not found: ID does not exist" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.173156 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.182143 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199067 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:13 crc kubenswrapper[4728]: E1216 15:16:13.199417 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="sg-core" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199433 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="sg-core" Dec 16 15:16:13 crc kubenswrapper[4728]: E1216 15:16:13.199449 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-notification-agent" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199458 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-notification-agent" Dec 16 15:16:13 crc kubenswrapper[4728]: E1216 15:16:13.199478 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-central-agent" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199526 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-central-agent" Dec 16 15:16:13 crc kubenswrapper[4728]: E1216 15:16:13.199539 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="proxy-httpd" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199544 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="proxy-httpd" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 
15:16:13.199698 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-central-agent" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199717 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="ceilometer-notification-agent" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199731 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="sg-core" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.199742 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" containerName="proxy-httpd" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.202469 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.207761 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.207800 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.216302 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366400 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366524 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-config-data\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366673 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366750 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-scripts\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjjbt\" (UniqueName: \"kubernetes.io/projected/8a410ae7-1999-4ce1-a2bb-94158e1b917e-kube-api-access-xjjbt\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366858 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-run-httpd\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.366932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-log-httpd\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.469452 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-log-httpd\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.469602 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.469638 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-config-data\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.469744 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.469914 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-log-httpd\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.470344 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-scripts\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.470461 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjjbt\" (UniqueName: \"kubernetes.io/projected/8a410ae7-1999-4ce1-a2bb-94158e1b917e-kube-api-access-xjjbt\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.470499 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-run-httpd\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.470781 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-run-httpd\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.474431 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-scripts\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.475593 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-config-data\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.477767 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.496399 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.500456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjjbt\" (UniqueName: \"kubernetes.io/projected/8a410ae7-1999-4ce1-a2bb-94158e1b917e-kube-api-access-xjjbt\") pod \"ceilometer-0\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.523155 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd94c25-cabc-42f7-9f8a-fb3c070622d3" path="/var/lib/kubelet/pods/cbd94c25-cabc-42f7-9f8a-fb3c070622d3/volumes" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.529036 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.779326 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.794250 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 16 15:16:13 crc kubenswrapper[4728]: W1216 15:16:13.963064 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a410ae7_1999_4ce1_a2bb_94158e1b917e.slice/crio-52b7ab2ca84ae77fc776add517dba339c064ef74a55948372e3fe66584cded91 WatchSource:0}: Error finding container 52b7ab2ca84ae77fc776add517dba339c064ef74a55948372e3fe66584cded91: Status 404 returned error can't find the container with id 52b7ab2ca84ae77fc776add517dba339c064ef74a55948372e3fe66584cded91 Dec 16 15:16:13 crc kubenswrapper[4728]: I1216 15:16:13.965507 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:14 crc kubenswrapper[4728]: I1216 15:16:14.362020 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 16 15:16:14 crc kubenswrapper[4728]: I1216 15:16:14.849284 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerStarted","Data":"52b7ab2ca84ae77fc776add517dba339c064ef74a55948372e3fe66584cded91"} Dec 16 15:16:14 crc kubenswrapper[4728]: I1216 15:16:14.858565 4728 generic.go:334] "Generic (PLEG): container finished" podID="3d185e68-d66c-438b-b4c2-bde356e4313e" containerID="a765450ffe387eb14fc11c7120b09befd31a586d0ce0af163b0ce45c794e0318" exitCode=0 Dec 16 15:16:14 crc kubenswrapper[4728]: I1216 15:16:14.858634 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bh899" event={"ID":"3d185e68-d66c-438b-b4c2-bde356e4313e","Type":"ContainerDied","Data":"a765450ffe387eb14fc11c7120b09befd31a586d0ce0af163b0ce45c794e0318"} Dec 16 15:16:15 crc kubenswrapper[4728]: I1216 15:16:15.869340 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerStarted","Data":"6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682"} Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.246132 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.435210 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvfwd\" (UniqueName: \"kubernetes.io/projected/3d185e68-d66c-438b-b4c2-bde356e4313e-kube-api-access-hvfwd\") pod \"3d185e68-d66c-438b-b4c2-bde356e4313e\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.435278 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-scripts\") pod \"3d185e68-d66c-438b-b4c2-bde356e4313e\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.435356 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-config-data\") pod \"3d185e68-d66c-438b-b4c2-bde356e4313e\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.435483 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-combined-ca-bundle\") pod \"3d185e68-d66c-438b-b4c2-bde356e4313e\" (UID: \"3d185e68-d66c-438b-b4c2-bde356e4313e\") " Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.440602 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d185e68-d66c-438b-b4c2-bde356e4313e-kube-api-access-hvfwd" (OuterVolumeSpecName: "kube-api-access-hvfwd") pod "3d185e68-d66c-438b-b4c2-bde356e4313e" (UID: "3d185e68-d66c-438b-b4c2-bde356e4313e"). InnerVolumeSpecName "kube-api-access-hvfwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.455607 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-scripts" (OuterVolumeSpecName: "scripts") pod "3d185e68-d66c-438b-b4c2-bde356e4313e" (UID: "3d185e68-d66c-438b-b4c2-bde356e4313e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.470526 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d185e68-d66c-438b-b4c2-bde356e4313e" (UID: "3d185e68-d66c-438b-b4c2-bde356e4313e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.482523 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-config-data" (OuterVolumeSpecName: "config-data") pod "3d185e68-d66c-438b-b4c2-bde356e4313e" (UID: "3d185e68-d66c-438b-b4c2-bde356e4313e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.536890 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.536925 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvfwd\" (UniqueName: \"kubernetes.io/projected/3d185e68-d66c-438b-b4c2-bde356e4313e-kube-api-access-hvfwd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.536935 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.536944 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d185e68-d66c-438b-b4c2-bde356e4313e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.881124 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bh899" event={"ID":"3d185e68-d66c-438b-b4c2-bde356e4313e","Type":"ContainerDied","Data":"ba46741d82bedf9dc4a421ce74f7877b01c58d1441c39c9af9eadf0c1b33c299"} Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.881425 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba46741d82bedf9dc4a421ce74f7877b01c58d1441c39c9af9eadf0c1b33c299" Dec 16 15:16:16 crc kubenswrapper[4728]: I1216 15:16:16.881480 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bh899" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.150896 4728 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.127765168s: [/var/lib/containers/storage/overlay/098417ceef5099e3abda933969f5693bac49cd0d5771b3ac994d4f6e21e7c2dd/diff /var/log/pods/openstack_placement-7dcd7544cd-gnxgg_7ac43e45-8d37-4ab4-9ebe-441421fe9044/placement-log/0.log]; will not log again for this container unless duration exceeds 2s Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.216397 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 15:16:18 crc kubenswrapper[4728]: E1216 15:16:18.216981 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d185e68-d66c-438b-b4c2-bde356e4313e" containerName="nova-cell0-conductor-db-sync" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.217013 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d185e68-d66c-438b-b4c2-bde356e4313e" containerName="nova-cell0-conductor-db-sync" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.217272 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d185e68-d66c-438b-b4c2-bde356e4313e" containerName="nova-cell0-conductor-db-sync" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.218241 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.221603 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zmrxn" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.221695 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.235708 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.273089 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4wf7\" (UniqueName: \"kubernetes.io/projected/06dceccb-f462-4eec-b6eb-e7b626c54b66-kube-api-access-v4wf7\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.273155 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dceccb-f462-4eec-b6eb-e7b626c54b66-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.273449 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dceccb-f462-4eec-b6eb-e7b626c54b66-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.375546 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4wf7\" (UniqueName: \"kubernetes.io/projected/06dceccb-f462-4eec-b6eb-e7b626c54b66-kube-api-access-v4wf7\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.375626 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dceccb-f462-4eec-b6eb-e7b626c54b66-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.375707 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dceccb-f462-4eec-b6eb-e7b626c54b66-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.380769 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dceccb-f462-4eec-b6eb-e7b626c54b66-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.380945 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dceccb-f462-4eec-b6eb-e7b626c54b66-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.392837 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4wf7\" (UniqueName: \"kubernetes.io/projected/06dceccb-f462-4eec-b6eb-e7b626c54b66-kube-api-access-v4wf7\") pod \"nova-cell0-conductor-0\" (UID: \"06dceccb-f462-4eec-b6eb-e7b626c54b66\") " pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.582164 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:18 crc kubenswrapper[4728]: I1216 15:16:18.901377 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerStarted","Data":"3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486"} Dec 16 15:16:19 crc kubenswrapper[4728]: I1216 15:16:19.032692 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 16 15:16:19 crc kubenswrapper[4728]: I1216 15:16:19.910794 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerStarted","Data":"86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01"} Dec 16 15:16:19 crc kubenswrapper[4728]: I1216 15:16:19.912557 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"06dceccb-f462-4eec-b6eb-e7b626c54b66","Type":"ContainerStarted","Data":"6cf9475e60af15f4f4f298ff43168f3a00b6cf707c754390afe35dfae4938042"} Dec 16 15:16:19 crc kubenswrapper[4728]: I1216 15:16:19.912895 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:19 crc kubenswrapper[4728]: I1216 15:16:19.912910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"06dceccb-f462-4eec-b6eb-e7b626c54b66","Type":"ContainerStarted","Data":"29f47d019a933f25b1b587bfe613ab0f0b22f186dbb168867061b5a0e02ff408"} Dec 16 15:16:19 crc kubenswrapper[4728]: I1216 15:16:19.930295 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.930279085 podStartE2EDuration="1.930279085s" podCreationTimestamp="2025-12-16 15:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:19.924532619 +0000 UTC m=+1160.764711603" watchObservedRunningTime="2025-12-16 15:16:19.930279085 +0000 UTC m=+1160.770458069" Dec 16 15:16:20 crc kubenswrapper[4728]: I1216 15:16:20.926762 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerStarted","Data":"a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432"} Dec 16 15:16:20 crc kubenswrapper[4728]: I1216 15:16:20.927176 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:16:20 crc kubenswrapper[4728]: I1216 15:16:20.952065 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.3529035 podStartE2EDuration="7.952043132s" podCreationTimestamp="2025-12-16 15:16:13 +0000 UTC" firstStartedPulling="2025-12-16 15:16:13.96545093 
+0000 UTC m=+1154.805629914" lastFinishedPulling="2025-12-16 15:16:20.564590522 +0000 UTC m=+1161.404769546" observedRunningTime="2025-12-16 15:16:20.948966798 +0000 UTC m=+1161.789145812" watchObservedRunningTime="2025-12-16 15:16:20.952043132 +0000 UTC m=+1161.792222136" Dec 16 15:16:24 crc kubenswrapper[4728]: I1216 15:16:24.842782 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:24 crc kubenswrapper[4728]: I1216 15:16:24.843486 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="916a6b2e-6b7b-457e-b2a2-80d02edc2217" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:28 crc kubenswrapper[4728]: I1216 15:16:28.612482 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.102593 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-q7mks"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.104798 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.115174 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.115555 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.117606 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-q7mks"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.194874 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-scripts\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.194939 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-config-data\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.195389 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.195519 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8hs\" 
(UniqueName: \"kubernetes.io/projected/c23498e4-9faf-4b56-9d5b-89a616514d12-kube-api-access-tc8hs\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.291326 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.292690 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.296633 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.296687 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8hs\" (UniqueName: \"kubernetes.io/projected/c23498e4-9faf-4b56-9d5b-89a616514d12-kube-api-access-tc8hs\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.296709 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-scripts\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.296733 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-config-data\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.302841 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.306112 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-config-data\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.308563 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.309562 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.310342 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.311590 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-scripts\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.319095 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.321988 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.333677 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.365794 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8hs\" (UniqueName: \"kubernetes.io/projected/c23498e4-9faf-4b56-9d5b-89a616514d12-kube-api-access-tc8hs\") pod \"nova-cell0-cell-mapping-q7mks\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.398818 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.398943 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-config-data\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.399006 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-config-data\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.399047 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m22n4\" (UniqueName: \"kubernetes.io/projected/958ad277-0abc-4eb0-a5c6-72839d6aec0d-kube-api-access-m22n4\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.399113 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/958ad277-0abc-4eb0-a5c6-72839d6aec0d-logs\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.399140 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " 
pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.399201 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d7bd\" (UniqueName: \"kubernetes.io/projected/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-kube-api-access-6d7bd\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.424847 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.460125 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.464187 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.472134 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500453 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p4rz\" (UniqueName: \"kubernetes.io/projected/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-kube-api-access-5p4rz\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500712 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500749 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-logs\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500772 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-config-data\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500808 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500828 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-config-data\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500850 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m22n4\" (UniqueName: \"kubernetes.io/projected/958ad277-0abc-4eb0-a5c6-72839d6aec0d-kube-api-access-m22n4\") 
pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500889 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/958ad277-0abc-4eb0-a5c6-72839d6aec0d-logs\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500904 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500931 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-config-data\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.500960 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d7bd\" (UniqueName: \"kubernetes.io/projected/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-kube-api-access-6d7bd\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.503552 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.517121 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/958ad277-0abc-4eb0-a5c6-72839d6aec0d-logs\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.524170 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.524428 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.550973 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-config-data\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.556204 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-config-data\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.561570 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6d7bd\" (UniqueName: \"kubernetes.io/projected/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-kube-api-access-6d7bd\") pod \"nova-scheduler-0\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.572747 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m22n4\" (UniqueName: \"kubernetes.io/projected/958ad277-0abc-4eb0-a5c6-72839d6aec0d-kube-api-access-m22n4\") pod \"nova-metadata-0\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.573487 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-c7xpj"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.574825 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.624579 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-c7xpj"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.626059 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-config-data\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.626231 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vt5f\" (UniqueName: \"kubernetes.io/projected/0ed56382-f55a-4e37-9ef1-c0725189578a-kube-api-access-5vt5f\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.626726 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4rz\" (UniqueName: \"kubernetes.io/projected/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-kube-api-access-5p4rz\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.626763 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.626901 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.627050 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-config\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 
15:16:29.627087 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-logs\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.627279 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.627534 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-svc\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.627734 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.639130 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.641717 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-logs\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.677799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.688563 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-config-data\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.694133 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4rz\" (UniqueName: \"kubernetes.io/projected/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-kube-api-access-5p4rz\") pod \"nova-api-0\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.696556 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.712487 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.714511 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.717323 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.724254 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.730283 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.730332 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-svc\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.730487 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vt5f\" (UniqueName: \"kubernetes.io/projected/0ed56382-f55a-4e37-9ef1-c0725189578a-kube-api-access-5vt5f\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.730553 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.730603 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.730631 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-config\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.731538 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-config\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.733222 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-svc\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.734462 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.734925 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.735420 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.735613 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.778172 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vt5f\" (UniqueName: \"kubernetes.io/projected/0ed56382-f55a-4e37-9ef1-c0725189578a-kube-api-access-5vt5f\") pod \"dnsmasq-dns-757b4f8459-c7xpj\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.831924 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.831995 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pzfb\" (UniqueName: \"kubernetes.io/projected/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-kube-api-access-7pzfb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.832453 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.934650 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pzfb\" (UniqueName: \"kubernetes.io/projected/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-kube-api-access-7pzfb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.934771 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.934818 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.940129 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.940974 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:29 crc kubenswrapper[4728]: I1216 15:16:29.954806 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pzfb\" (UniqueName: \"kubernetes.io/projected/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-kube-api-access-7pzfb\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.041788 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.057099 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-q7mks"] Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.066999 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.226545 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:30 crc kubenswrapper[4728]: W1216 15:16:30.258582 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82e422bb_b40d_4ccf_a882_e6b8ea94cc70.slice/crio-7b6a9105e93c3db52aa957e624d3bdaf0a0057bda8bba95d02d652a7a2ea74e2 WatchSource:0}: Error finding container 7b6a9105e93c3db52aa957e624d3bdaf0a0057bda8bba95d02d652a7a2ea74e2: Status 404 returned error can't find the container with id 7b6a9105e93c3db52aa957e624d3bdaf0a0057bda8bba95d02d652a7a2ea74e2 Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.260937 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fkg4z"] Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.262201 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.264782 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.264954 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.271517 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fkg4z"] Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.305949 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:30 crc kubenswrapper[4728]: W1216 15:16:30.310926 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod958ad277_0abc_4eb0_a5c6_72839d6aec0d.slice/crio-2e0f5efdf323c2ce6101800ace7061eaa8be521b4db9bf8b8354ad2be4d2ea99 WatchSource:0}: Error finding container 2e0f5efdf323c2ce6101800ace7061eaa8be521b4db9bf8b8354ad2be4d2ea99: Status 404 returned error can't find the container with id 2e0f5efdf323c2ce6101800ace7061eaa8be521b4db9bf8b8354ad2be4d2ea99 Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.343163 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qk7h\" (UniqueName: \"kubernetes.io/projected/67785338-b264-4d86-b1b5-6ca4248d938f-kube-api-access-9qk7h\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.343233 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-config-data\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.343337 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.343384 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-scripts\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.366289 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.445690 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qk7h\" (UniqueName: \"kubernetes.io/projected/67785338-b264-4d86-b1b5-6ca4248d938f-kube-api-access-9qk7h\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc 
kubenswrapper[4728]: I1216 15:16:30.445745 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-config-data\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.445846 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.445870 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-scripts\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.450465 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-config-data\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.451250 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-scripts\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.451429 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.465568 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qk7h\" (UniqueName: \"kubernetes.io/projected/67785338-b264-4d86-b1b5-6ca4248d938f-kube-api-access-9qk7h\") pod \"nova-cell1-conductor-db-sync-fkg4z\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.569813 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-c7xpj"] Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.583764 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:30 crc kubenswrapper[4728]: W1216 15:16:30.791035 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f3c8e2b_5ba5_482f_9e11_b3daacb44963.slice/crio-a2bc05e2234fec534688ba06e26f955c96cc43eef7b21b32d534f572dfa9ec7b WatchSource:0}: Error finding container a2bc05e2234fec534688ba06e26f955c96cc43eef7b21b32d534f572dfa9ec7b: Status 404 returned error can't find the container with id a2bc05e2234fec534688ba06e26f955c96cc43eef7b21b32d534f572dfa9ec7b Dec 16 15:16:30 crc kubenswrapper[4728]: I1216 15:16:30.813489 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.046582 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q7mks" event={"ID":"c23498e4-9faf-4b56-9d5b-89a616514d12","Type":"ContainerStarted","Data":"be8581302b8e03def72156a946a173be936535bd4023a90af740c0acea38830a"} Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.048154 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"958ad277-0abc-4eb0-a5c6-72839d6aec0d","Type":"ContainerStarted","Data":"2e0f5efdf323c2ce6101800ace7061eaa8be521b4db9bf8b8354ad2be4d2ea99"} Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.049422 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" event={"ID":"0ed56382-f55a-4e37-9ef1-c0725189578a","Type":"ContainerStarted","Data":"08efae766f9067d98e209d5177d43466ebb92c0082374974ddad32b2f60e63cc"} Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.050645 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f3c8e2b-5ba5-482f-9e11-b3daacb44963","Type":"ContainerStarted","Data":"a2bc05e2234fec534688ba06e26f955c96cc43eef7b21b32d534f572dfa9ec7b"} Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.051919 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9545cf88-59bd-4e97-9eb2-25b59b0f3db1","Type":"ContainerStarted","Data":"311e470bd06f0075dfe81e35da2b2ec49face52972d48215635de442d7a63554"} Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.052847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82e422bb-b40d-4ccf-a882-e6b8ea94cc70","Type":"ContainerStarted","Data":"7b6a9105e93c3db52aa957e624d3bdaf0a0057bda8bba95d02d652a7a2ea74e2"} Dec 16 15:16:31 crc kubenswrapper[4728]: I1216 15:16:31.101244 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fkg4z"] Dec 16 15:16:32 crc kubenswrapper[4728]: I1216 15:16:32.063678 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" event={"ID":"67785338-b264-4d86-b1b5-6ca4248d938f","Type":"ContainerStarted","Data":"f8d15615f6b792fe02a5897a56834f724b5d4405890176b86fee7e4273a2c4a8"} Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.072531 4728 generic.go:334] "Generic (PLEG): container finished" podID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerID="e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357" exitCode=0 Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.072879 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" 
event={"ID":"0ed56382-f55a-4e37-9ef1-c0725189578a","Type":"ContainerDied","Data":"e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357"} Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.079413 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q7mks" event={"ID":"c23498e4-9faf-4b56-9d5b-89a616514d12","Type":"ContainerStarted","Data":"e14a33014110e81fd38c867f8181b9ccd0a7c3b800cea2ffeb1ed3977faa7a9d"} Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.087150 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" event={"ID":"67785338-b264-4d86-b1b5-6ca4248d938f","Type":"ContainerStarted","Data":"9434c9173fb0736f4e65b34da0a27088919248ae8bcafea8cb472449cda9bf4e"} Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.139026 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-q7mks" podStartSLOduration=4.139008364 podStartE2EDuration="4.139008364s" podCreationTimestamp="2025-12-16 15:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:33.121002335 +0000 UTC m=+1173.961181319" watchObservedRunningTime="2025-12-16 15:16:33.139008364 +0000 UTC m=+1173.979187348" Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.144391 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" podStartSLOduration=3.14437209 podStartE2EDuration="3.14437209s" podCreationTimestamp="2025-12-16 15:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:33.133217677 +0000 UTC m=+1173.973396661" watchObservedRunningTime="2025-12-16 15:16:33.14437209 +0000 UTC m=+1173.984551074" Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.438378 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:33 crc kubenswrapper[4728]: I1216 15:16:33.454297 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:16:34 crc kubenswrapper[4728]: I1216 15:16:34.099148 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" event={"ID":"0ed56382-f55a-4e37-9ef1-c0725189578a","Type":"ContainerStarted","Data":"2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa"} Dec 16 15:16:34 crc kubenswrapper[4728]: I1216 15:16:34.137067 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" podStartSLOduration=5.137036645 podStartE2EDuration="5.137036645s" podCreationTimestamp="2025-12-16 15:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:34.11953835 +0000 UTC m=+1174.959717364" watchObservedRunningTime="2025-12-16 15:16:34.137036645 +0000 UTC m=+1174.977215649" Dec 16 15:16:35 crc kubenswrapper[4728]: I1216 15:16:35.042154 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:38 crc kubenswrapper[4728]: I1216 15:16:38.158154 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"82e422bb-b40d-4ccf-a882-e6b8ea94cc70","Type":"ContainerStarted","Data":"813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98"} Dec 16 15:16:38 crc kubenswrapper[4728]: I1216 15:16:38.161830 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f3c8e2b-5ba5-482f-9e11-b3daacb44963","Type":"ContainerStarted","Data":"b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12"} Dec 16 15:16:38 crc kubenswrapper[4728]: I1216 15:16:38.162066 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1f3c8e2b-5ba5-482f-9e11-b3daacb44963" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12" gracePeriod=30 Dec 16 15:16:38 crc kubenswrapper[4728]: I1216 15:16:38.169071 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9545cf88-59bd-4e97-9eb2-25b59b0f3db1","Type":"ContainerStarted","Data":"c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989"} Dec 16 15:16:38 crc kubenswrapper[4728]: I1216 15:16:38.180992 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.232191065 podStartE2EDuration="9.18097219s" podCreationTimestamp="2025-12-16 15:16:29 +0000 UTC" firstStartedPulling="2025-12-16 15:16:30.279876666 +0000 UTC m=+1171.120055650" lastFinishedPulling="2025-12-16 15:16:37.228657781 +0000 UTC m=+1178.068836775" observedRunningTime="2025-12-16 15:16:38.174802113 +0000 UTC m=+1179.014981107" watchObservedRunningTime="2025-12-16 15:16:38.18097219 +0000 UTC m=+1179.021151174" Dec 16 15:16:38 crc kubenswrapper[4728]: I1216 15:16:38.221254 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.799125992 podStartE2EDuration="9.221232964s" podCreationTimestamp="2025-12-16 15:16:29 +0000 UTC" firstStartedPulling="2025-12-16 15:16:30.808531283 +0000 UTC m=+1171.648710267" lastFinishedPulling="2025-12-16 15:16:37.230638255 +0000 UTC m=+1178.070817239" observedRunningTime="2025-12-16 15:16:38.205115187 +0000 UTC m=+1179.045294191" watchObservedRunningTime="2025-12-16 15:16:38.221232964 +0000 UTC m=+1179.061411958" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.198024 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"958ad277-0abc-4eb0-a5c6-72839d6aec0d","Type":"ContainerStarted","Data":"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102"} Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.198464 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"958ad277-0abc-4eb0-a5c6-72839d6aec0d","Type":"ContainerStarted","Data":"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d"} Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.198166 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-log" containerID="cri-o://c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d" gracePeriod=30 Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.198351 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" 
containerName="nova-metadata-metadata" containerID="cri-o://4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102" gracePeriod=30 Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.212707 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9545cf88-59bd-4e97-9eb2-25b59b0f3db1","Type":"ContainerStarted","Data":"fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f"} Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.225246 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.309517753 podStartE2EDuration="10.225228948s" podCreationTimestamp="2025-12-16 15:16:29 +0000 UTC" firstStartedPulling="2025-12-16 15:16:30.312291178 +0000 UTC m=+1171.152470162" lastFinishedPulling="2025-12-16 15:16:37.228002353 +0000 UTC m=+1178.068181357" observedRunningTime="2025-12-16 15:16:39.222887794 +0000 UTC m=+1180.063066828" watchObservedRunningTime="2025-12-16 15:16:39.225228948 +0000 UTC m=+1180.065407932" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.248349 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.394729228 podStartE2EDuration="10.248332576s" podCreationTimestamp="2025-12-16 15:16:29 +0000 UTC" firstStartedPulling="2025-12-16 15:16:30.376581115 +0000 UTC m=+1171.216760099" lastFinishedPulling="2025-12-16 15:16:37.230184443 +0000 UTC m=+1178.070363447" observedRunningTime="2025-12-16 15:16:39.247225456 +0000 UTC m=+1180.087404430" watchObservedRunningTime="2025-12-16 15:16:39.248332576 +0000 UTC m=+1180.088511560" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.641027 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.641466 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.668705 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.696939 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.696994 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.731359 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.731418 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:16:39 crc kubenswrapper[4728]: I1216 15:16:39.967174 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.044050 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.050515 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-config-data\") pod \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.050596 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/958ad277-0abc-4eb0-a5c6-72839d6aec0d-logs\") pod \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.050623 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-combined-ca-bundle\") pod \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.050674 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m22n4\" (UniqueName: \"kubernetes.io/projected/958ad277-0abc-4eb0-a5c6-72839d6aec0d-kube-api-access-m22n4\") pod \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\" (UID: \"958ad277-0abc-4eb0-a5c6-72839d6aec0d\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.051298 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/958ad277-0abc-4eb0-a5c6-72839d6aec0d-logs" (OuterVolumeSpecName: "logs") pod "958ad277-0abc-4eb0-a5c6-72839d6aec0d" (UID: "958ad277-0abc-4eb0-a5c6-72839d6aec0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.055934 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958ad277-0abc-4eb0-a5c6-72839d6aec0d-kube-api-access-m22n4" (OuterVolumeSpecName: "kube-api-access-m22n4") pod "958ad277-0abc-4eb0-a5c6-72839d6aec0d" (UID: "958ad277-0abc-4eb0-a5c6-72839d6aec0d"). InnerVolumeSpecName "kube-api-access-m22n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.067878 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.094275 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "958ad277-0abc-4eb0-a5c6-72839d6aec0d" (UID: "958ad277-0abc-4eb0-a5c6-72839d6aec0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.112509 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-config-data" (OuterVolumeSpecName: "config-data") pod "958ad277-0abc-4eb0-a5c6-72839d6aec0d" (UID: "958ad277-0abc-4eb0-a5c6-72839d6aec0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.115456 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jx95k"] Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.115914 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" podUID="eae8087e-b189-4db5-b646-d421f63a4828" containerName="dnsmasq-dns" containerID="cri-o://1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56" gracePeriod=10 Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.159303 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m22n4\" (UniqueName: \"kubernetes.io/projected/958ad277-0abc-4eb0-a5c6-72839d6aec0d-kube-api-access-m22n4\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.159343 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.159359 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/958ad277-0abc-4eb0-a5c6-72839d6aec0d-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.159371 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958ad277-0abc-4eb0-a5c6-72839d6aec0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.225343 4728 generic.go:334] "Generic (PLEG): container finished" podID="c23498e4-9faf-4b56-9d5b-89a616514d12" containerID="e14a33014110e81fd38c867f8181b9ccd0a7c3b800cea2ffeb1ed3977faa7a9d" exitCode=0 Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.225423 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q7mks" event={"ID":"c23498e4-9faf-4b56-9d5b-89a616514d12","Type":"ContainerDied","Data":"e14a33014110e81fd38c867f8181b9ccd0a7c3b800cea2ffeb1ed3977faa7a9d"} Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.233574 4728 generic.go:334] "Generic (PLEG): container finished" podID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerID="4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102" exitCode=0 Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.233610 4728 generic.go:334] "Generic (PLEG): container finished" podID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerID="c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d" exitCode=143 Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.234579 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.238242 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"958ad277-0abc-4eb0-a5c6-72839d6aec0d","Type":"ContainerDied","Data":"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102"} Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.238286 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"958ad277-0abc-4eb0-a5c6-72839d6aec0d","Type":"ContainerDied","Data":"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d"} Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.238299 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"958ad277-0abc-4eb0-a5c6-72839d6aec0d","Type":"ContainerDied","Data":"2e0f5efdf323c2ce6101800ace7061eaa8be521b4db9bf8b8354ad2be4d2ea99"} Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.238318 4728 scope.go:117] "RemoveContainer" containerID="4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.280674 4728 scope.go:117] "RemoveContainer" containerID="c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.288101 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.304374 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.317584 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.332225 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:40 crc kubenswrapper[4728]: E1216 15:16:40.332682 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-log" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.332699 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-log" Dec 16 15:16:40 crc kubenswrapper[4728]: E1216 15:16:40.332746 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-metadata" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.332756 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-metadata" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.332976 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-metadata" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.333006 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" containerName="nova-metadata-log" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.348716 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.349021 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.349339 4728 scope.go:117] "RemoveContainer" containerID="4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.352442 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.352775 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:16:40 crc kubenswrapper[4728]: E1216 15:16:40.366524 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102\": container with ID starting with 4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102 not found: ID does not exist" containerID="4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.366782 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102"} err="failed to get container status \"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102\": rpc error: code = NotFound desc = could not find container \"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102\": container with ID starting with 4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102 not found: ID does not exist" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.366808 4728 scope.go:117] "RemoveContainer" containerID="c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d" Dec 16 15:16:40 crc kubenswrapper[4728]: E1216 15:16:40.369888 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d\": container with ID starting with c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d not found: ID does not exist" containerID="c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.369938 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d"} err="failed to get container status \"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d\": rpc error: code = NotFound desc = could not find container \"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d\": container with ID starting with c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d not found: ID does not exist" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.369969 4728 scope.go:117] "RemoveContainer" containerID="4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.370285 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102"} err="failed to get container status \"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102\": rpc error: code = NotFound desc = could not find container \"4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102\": container with ID starting with 
4a9a60457e9158dac32ea5443cf832cb2fcd996246496ca17e2a3fddd4b61102 not found: ID does not exist" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.370340 4728 scope.go:117] "RemoveContainer" containerID="c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.371300 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d"} err="failed to get container status \"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d\": rpc error: code = NotFound desc = could not find container \"c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d\": container with ID starting with c4ea5c305da89ccf56d18e04196af443603db1acc6d807c4ab7e97397589f76d not found: ID does not exist" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.464298 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3139eb80-61ac-456d-9834-707a65c6ff98-logs\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.464367 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-config-data\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.464560 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.464615 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.464647 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdt95\" (UniqueName: \"kubernetes.io/projected/3139eb80-61ac-456d-9834-707a65c6ff98-kube-api-access-wdt95\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.566479 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.566614 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc 
kubenswrapper[4728]: I1216 15:16:40.566652 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdt95\" (UniqueName: \"kubernetes.io/projected/3139eb80-61ac-456d-9834-707a65c6ff98-kube-api-access-wdt95\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.566710 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3139eb80-61ac-456d-9834-707a65c6ff98-logs\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.566735 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-config-data\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.567993 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3139eb80-61ac-456d-9834-707a65c6ff98-logs\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.573897 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.577320 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.599376 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-config-data\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.613933 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdt95\" (UniqueName: \"kubernetes.io/projected/3139eb80-61ac-456d-9834-707a65c6ff98-kube-api-access-wdt95\") pod \"nova-metadata-0\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.711304 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.728034 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.832639 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.832650 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.871482 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-sb\") pod \"eae8087e-b189-4db5-b646-d421f63a4828\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.871591 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-swift-storage-0\") pod \"eae8087e-b189-4db5-b646-d421f63a4828\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.871689 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-config\") pod \"eae8087e-b189-4db5-b646-d421f63a4828\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.871720 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-svc\") pod \"eae8087e-b189-4db5-b646-d421f63a4828\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.871827 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4z6q\" (UniqueName: \"kubernetes.io/projected/eae8087e-b189-4db5-b646-d421f63a4828-kube-api-access-w4z6q\") pod \"eae8087e-b189-4db5-b646-d421f63a4828\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.871856 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-nb\") pod \"eae8087e-b189-4db5-b646-d421f63a4828\" (UID: \"eae8087e-b189-4db5-b646-d421f63a4828\") " Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.881063 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae8087e-b189-4db5-b646-d421f63a4828-kube-api-access-w4z6q" (OuterVolumeSpecName: "kube-api-access-w4z6q") pod "eae8087e-b189-4db5-b646-d421f63a4828" (UID: "eae8087e-b189-4db5-b646-d421f63a4828"). InnerVolumeSpecName "kube-api-access-w4z6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.935434 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-config" (OuterVolumeSpecName: "config") pod "eae8087e-b189-4db5-b646-d421f63a4828" (UID: "eae8087e-b189-4db5-b646-d421f63a4828"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.945098 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eae8087e-b189-4db5-b646-d421f63a4828" (UID: "eae8087e-b189-4db5-b646-d421f63a4828"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.951046 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eae8087e-b189-4db5-b646-d421f63a4828" (UID: "eae8087e-b189-4db5-b646-d421f63a4828"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.954192 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eae8087e-b189-4db5-b646-d421f63a4828" (UID: "eae8087e-b189-4db5-b646-d421f63a4828"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.960478 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eae8087e-b189-4db5-b646-d421f63a4828" (UID: "eae8087e-b189-4db5-b646-d421f63a4828"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.974160 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4z6q\" (UniqueName: \"kubernetes.io/projected/eae8087e-b189-4db5-b646-d421f63a4828-kube-api-access-w4z6q\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.974363 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.974500 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.974584 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.974665 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:40 crc kubenswrapper[4728]: I1216 15:16:40.974760 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae8087e-b189-4db5-b646-d421f63a4828-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.239539 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.248969 4728 generic.go:334] "Generic (PLEG): container finished" podID="eae8087e-b189-4db5-b646-d421f63a4828" containerID="1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56" exitCode=0 Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.249347 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" event={"ID":"eae8087e-b189-4db5-b646-d421f63a4828","Type":"ContainerDied","Data":"1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56"} Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.249378 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" event={"ID":"eae8087e-b189-4db5-b646-d421f63a4828","Type":"ContainerDied","Data":"72d541b05fca4293c3199a0c11d3048f323320a3ad5dfb55c21e7120e14f8a3b"} Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.249399 4728 scope.go:117] "RemoveContainer" containerID="1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.249565 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jx95k" Dec 16 15:16:41 crc kubenswrapper[4728]: W1216 15:16:41.271227 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3139eb80_61ac_456d_9834_707a65c6ff98.slice/crio-6ed65fed8e8e0552eee6b90cc673891c79b57972bb042a24608b5bcbfb932c0d WatchSource:0}: Error finding container 6ed65fed8e8e0552eee6b90cc673891c79b57972bb042a24608b5bcbfb932c0d: Status 404 returned error can't find the container with id 6ed65fed8e8e0552eee6b90cc673891c79b57972bb042a24608b5bcbfb932c0d Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.294127 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jx95k"] Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.302795 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jx95k"] Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.306748 4728 scope.go:117] "RemoveContainer" containerID="36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.427131 4728 scope.go:117] "RemoveContainer" containerID="1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56" Dec 16 15:16:41 crc kubenswrapper[4728]: E1216 15:16:41.427488 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56\": container with ID starting with 1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56 not found: ID does not exist" containerID="1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.427513 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56"} err="failed to get container status \"1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56\": rpc error: code = NotFound desc = could not find container \"1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56\": container with ID starting with 1f93d8d950e9287aedc95ead3de3579b4d18653071ee8898d89391d8149fcc56 not found: ID does not exist" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.427531 4728 scope.go:117] "RemoveContainer" containerID="36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29" Dec 16 15:16:41 crc kubenswrapper[4728]: E1216 15:16:41.427766 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29\": container with ID starting with 36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29 not found: ID does not exist" containerID="36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.427785 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29"} err="failed to get container status \"36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29\": rpc error: code = NotFound desc = could not find container \"36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29\": container with ID starting with 
36205ca294dacdd2683d7f17a5a6d62b99b7d37f0bf2c71b85db58eabfcd3d29 not found: ID does not exist" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.532307 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958ad277-0abc-4eb0-a5c6-72839d6aec0d" path="/var/lib/kubelet/pods/958ad277-0abc-4eb0-a5c6-72839d6aec0d/volumes" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.533079 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae8087e-b189-4db5-b646-d421f63a4828" path="/var/lib/kubelet/pods/eae8087e-b189-4db5-b646-d421f63a4828/volumes" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.705344 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.895401 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-combined-ca-bundle\") pod \"c23498e4-9faf-4b56-9d5b-89a616514d12\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.895865 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-config-data\") pod \"c23498e4-9faf-4b56-9d5b-89a616514d12\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.895983 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-scripts\") pod \"c23498e4-9faf-4b56-9d5b-89a616514d12\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.896101 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc8hs\" (UniqueName: \"kubernetes.io/projected/c23498e4-9faf-4b56-9d5b-89a616514d12-kube-api-access-tc8hs\") pod \"c23498e4-9faf-4b56-9d5b-89a616514d12\" (UID: \"c23498e4-9faf-4b56-9d5b-89a616514d12\") " Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.898912 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-scripts" (OuterVolumeSpecName: "scripts") pod "c23498e4-9faf-4b56-9d5b-89a616514d12" (UID: "c23498e4-9faf-4b56-9d5b-89a616514d12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.900852 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23498e4-9faf-4b56-9d5b-89a616514d12-kube-api-access-tc8hs" (OuterVolumeSpecName: "kube-api-access-tc8hs") pod "c23498e4-9faf-4b56-9d5b-89a616514d12" (UID: "c23498e4-9faf-4b56-9d5b-89a616514d12"). InnerVolumeSpecName "kube-api-access-tc8hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.935848 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c23498e4-9faf-4b56-9d5b-89a616514d12" (UID: "c23498e4-9faf-4b56-9d5b-89a616514d12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.947866 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-config-data" (OuterVolumeSpecName: "config-data") pod "c23498e4-9faf-4b56-9d5b-89a616514d12" (UID: "c23498e4-9faf-4b56-9d5b-89a616514d12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.998947 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.998980 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.998993 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23498e4-9faf-4b56-9d5b-89a616514d12-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:41 crc kubenswrapper[4728]: I1216 15:16:41.999006 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc8hs\" (UniqueName: \"kubernetes.io/projected/c23498e4-9faf-4b56-9d5b-89a616514d12-kube-api-access-tc8hs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.274179 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3139eb80-61ac-456d-9834-707a65c6ff98","Type":"ContainerStarted","Data":"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf"} Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.274225 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3139eb80-61ac-456d-9834-707a65c6ff98","Type":"ContainerStarted","Data":"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff"} Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.274236 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3139eb80-61ac-456d-9834-707a65c6ff98","Type":"ContainerStarted","Data":"6ed65fed8e8e0552eee6b90cc673891c79b57972bb042a24608b5bcbfb932c0d"} Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.278109 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q7mks" event={"ID":"c23498e4-9faf-4b56-9d5b-89a616514d12","Type":"ContainerDied","Data":"be8581302b8e03def72156a946a173be936535bd4023a90af740c0acea38830a"} Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.278164 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8581302b8e03def72156a946a173be936535bd4023a90af740c0acea38830a" Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.278907 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q7mks" Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.312499 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.312481735 podStartE2EDuration="2.312481735s" podCreationTimestamp="2025-12-16 15:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:42.298773272 +0000 UTC m=+1183.138952266" watchObservedRunningTime="2025-12-16 15:16:42.312481735 +0000 UTC m=+1183.152660719" Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.426365 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.434784 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.435241 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-log" containerID="cri-o://c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989" gracePeriod=30 Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.435466 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-api" containerID="cri-o://fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f" gracePeriod=30 Dec 16 15:16:42 crc kubenswrapper[4728]: I1216 15:16:42.443790 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:43 crc kubenswrapper[4728]: I1216 15:16:43.297294 4728 generic.go:334] "Generic (PLEG): container finished" podID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerID="c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989" exitCode=143 Dec 16 15:16:43 crc kubenswrapper[4728]: I1216 15:16:43.297336 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9545cf88-59bd-4e97-9eb2-25b59b0f3db1","Type":"ContainerDied","Data":"c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989"} Dec 16 15:16:43 crc kubenswrapper[4728]: I1216 15:16:43.297570 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" containerName="nova-scheduler-scheduler" containerID="cri-o://813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" gracePeriod=30 Dec 16 15:16:43 crc kubenswrapper[4728]: I1216 15:16:43.536359 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 15:16:44 crc kubenswrapper[4728]: I1216 15:16:44.306623 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-log" containerID="cri-o://45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff" gracePeriod=30 Dec 16 15:16:44 crc kubenswrapper[4728]: I1216 15:16:44.306686 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-metadata" containerID="cri-o://9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf" gracePeriod=30 Dec 16 
15:16:44 crc kubenswrapper[4728]: E1216 15:16:44.642622 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:16:44 crc kubenswrapper[4728]: E1216 15:16:44.647067 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:16:44 crc kubenswrapper[4728]: E1216 15:16:44.650778 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:16:44 crc kubenswrapper[4728]: E1216 15:16:44.650814 4728 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" containerName="nova-scheduler-scheduler" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.220099 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.274505 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-combined-ca-bundle\") pod \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.274670 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-config-data\") pod \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.274815 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d7bd\" (UniqueName: \"kubernetes.io/projected/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-kube-api-access-6d7bd\") pod \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\" (UID: \"82e422bb-b40d-4ccf-a882-e6b8ea94cc70\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.285822 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-kube-api-access-6d7bd" (OuterVolumeSpecName: "kube-api-access-6d7bd") pod "82e422bb-b40d-4ccf-a882-e6b8ea94cc70" (UID: "82e422bb-b40d-4ccf-a882-e6b8ea94cc70"). InnerVolumeSpecName "kube-api-access-6d7bd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.314166 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82e422bb-b40d-4ccf-a882-e6b8ea94cc70" (UID: "82e422bb-b40d-4ccf-a882-e6b8ea94cc70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.319453 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.327058 4728 generic.go:334] "Generic (PLEG): container finished" podID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" exitCode=0 Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.327147 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82e422bb-b40d-4ccf-a882-e6b8ea94cc70","Type":"ContainerDied","Data":"813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98"} Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.327175 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82e422bb-b40d-4ccf-a882-e6b8ea94cc70","Type":"ContainerDied","Data":"7b6a9105e93c3db52aa957e624d3bdaf0a0057bda8bba95d02d652a7a2ea74e2"} Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.327190 4728 scope.go:117] "RemoveContainer" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.327296 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330132 4728 generic.go:334] "Generic (PLEG): container finished" podID="3139eb80-61ac-456d-9834-707a65c6ff98" containerID="9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf" exitCode=0 Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330158 4728 generic.go:334] "Generic (PLEG): container finished" podID="3139eb80-61ac-456d-9834-707a65c6ff98" containerID="45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff" exitCode=143 Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330168 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3139eb80-61ac-456d-9834-707a65c6ff98","Type":"ContainerDied","Data":"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf"} Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330200 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330213 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3139eb80-61ac-456d-9834-707a65c6ff98","Type":"ContainerDied","Data":"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff"} Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330227 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3139eb80-61ac-456d-9834-707a65c6ff98","Type":"ContainerDied","Data":"6ed65fed8e8e0552eee6b90cc673891c79b57972bb042a24608b5bcbfb932c0d"} Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.330609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-config-data" (OuterVolumeSpecName: "config-data") pod "82e422bb-b40d-4ccf-a882-e6b8ea94cc70" (UID: "82e422bb-b40d-4ccf-a882-e6b8ea94cc70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.352972 4728 scope.go:117] "RemoveContainer" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.353531 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98\": container with ID starting with 813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98 not found: ID does not exist" containerID="813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.353569 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98"} err="failed to get container status \"813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98\": rpc error: code = NotFound desc = could not find container \"813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98\": container with ID starting with 813236339f814e75619841eec95d4794540802530fb5fb4a3322e3e22f56fb98 not found: ID does not exist" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.353598 4728 scope.go:117] "RemoveContainer" containerID="9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.377245 4728 scope.go:117] "RemoveContainer" containerID="45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.380500 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3139eb80-61ac-456d-9834-707a65c6ff98-logs\") pod \"3139eb80-61ac-456d-9834-707a65c6ff98\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.380596 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-config-data\") pod \"3139eb80-61ac-456d-9834-707a65c6ff98\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.380648 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-combined-ca-bundle\") pod \"3139eb80-61ac-456d-9834-707a65c6ff98\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.380786 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdt95\" (UniqueName: \"kubernetes.io/projected/3139eb80-61ac-456d-9834-707a65c6ff98-kube-api-access-wdt95\") pod \"3139eb80-61ac-456d-9834-707a65c6ff98\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.380823 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-nova-metadata-tls-certs\") pod \"3139eb80-61ac-456d-9834-707a65c6ff98\" (UID: \"3139eb80-61ac-456d-9834-707a65c6ff98\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.380929 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3139eb80-61ac-456d-9834-707a65c6ff98-logs" (OuterVolumeSpecName: "logs") pod "3139eb80-61ac-456d-9834-707a65c6ff98" (UID: "3139eb80-61ac-456d-9834-707a65c6ff98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.381273 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d7bd\" (UniqueName: \"kubernetes.io/projected/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-kube-api-access-6d7bd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.381293 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3139eb80-61ac-456d-9834-707a65c6ff98-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.381303 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.381313 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e422bb-b40d-4ccf-a882-e6b8ea94cc70-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.386062 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3139eb80-61ac-456d-9834-707a65c6ff98-kube-api-access-wdt95" (OuterVolumeSpecName: "kube-api-access-wdt95") pod "3139eb80-61ac-456d-9834-707a65c6ff98" (UID: "3139eb80-61ac-456d-9834-707a65c6ff98"). InnerVolumeSpecName "kube-api-access-wdt95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.402778 4728 scope.go:117] "RemoveContainer" containerID="9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.403202 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf\": container with ID starting with 9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf not found: ID does not exist" containerID="9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.403222 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf"} err="failed to get container status \"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf\": rpc error: code = NotFound desc = could not find container \"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf\": container with ID starting with 9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf not found: ID does not exist" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.403243 4728 scope.go:117] "RemoveContainer" containerID="45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.403739 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff\": container with ID starting with 45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff not found: ID does not exist" containerID="45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.403757 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff"} err="failed to get container status \"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff\": rpc error: code = NotFound desc = could not find container \"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff\": container with ID starting with 45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff not found: ID does not exist" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.403770 4728 scope.go:117] "RemoveContainer" containerID="9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.403994 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf"} err="failed to get container status \"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf\": rpc error: code = NotFound desc = could not find container \"9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf\": container with ID starting with 9b6e9aa9b0c18cef15b1e80792856b46c0b67b4e225cdc4aed4a5fc0013a8bdf not found: ID does not exist" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.404008 4728 scope.go:117] "RemoveContainer" containerID="45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.404319 4728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff"} err="failed to get container status \"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff\": rpc error: code = NotFound desc = could not find container \"45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff\": container with ID starting with 45224000a8a595503e2eeb916bfe993a867e481f2e6076ae9fb283a228519dff not found: ID does not exist" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.408877 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3139eb80-61ac-456d-9834-707a65c6ff98" (UID: "3139eb80-61ac-456d-9834-707a65c6ff98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.416631 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-config-data" (OuterVolumeSpecName: "config-data") pod "3139eb80-61ac-456d-9834-707a65c6ff98" (UID: "3139eb80-61ac-456d-9834-707a65c6ff98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.482513 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdt95\" (UniqueName: \"kubernetes.io/projected/3139eb80-61ac-456d-9834-707a65c6ff98-kube-api-access-wdt95\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.482551 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.482561 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.494556 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3139eb80-61ac-456d-9834-707a65c6ff98" (UID: "3139eb80-61ac-456d-9834-707a65c6ff98"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.584318 4728 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3139eb80-61ac-456d-9834-707a65c6ff98-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.750526 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.760229 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.767811 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.768275 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-metadata" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768298 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-metadata" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.768316 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae8087e-b189-4db5-b646-d421f63a4828" containerName="dnsmasq-dns" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768325 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae8087e-b189-4db5-b646-d421f63a4828" containerName="dnsmasq-dns" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.768343 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23498e4-9faf-4b56-9d5b-89a616514d12" containerName="nova-manage" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768352 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23498e4-9faf-4b56-9d5b-89a616514d12" containerName="nova-manage" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.768363 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" containerName="nova-scheduler-scheduler" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768371 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" containerName="nova-scheduler-scheduler" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.768389 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-log" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768397 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-log" Dec 16 15:16:45 crc kubenswrapper[4728]: E1216 15:16:45.768469 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae8087e-b189-4db5-b646-d421f63a4828" containerName="init" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768478 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae8087e-b189-4db5-b646-d421f63a4828" containerName="init" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768690 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-metadata" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768704 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23498e4-9faf-4b56-9d5b-89a616514d12" 
containerName="nova-manage" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768721 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" containerName="nova-scheduler-scheduler" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768738 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae8087e-b189-4db5-b646-d421f63a4828" containerName="dnsmasq-dns" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.768755 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" containerName="nova-metadata-log" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.769872 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.773423 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.773611 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.787208 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqf8\" (UniqueName: \"kubernetes.io/projected/3c71d61c-fde8-4ba1-a572-aac714b424fe-kube-api-access-wxqf8\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.787295 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-config-data\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.787342 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.787379 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c71d61c-fde8-4ba1-a572-aac714b424fe-logs\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.787421 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.787582 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.802154 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.809535 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:45 crc 
kubenswrapper[4728]: I1216 15:16:45.818937 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.820147 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.827187 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.850907 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892220 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c71d61c-fde8-4ba1-a572-aac714b424fe-logs\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892272 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-config-data\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892303 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892391 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nndn\" (UniqueName: \"kubernetes.io/projected/a27a5934-36e6-4c83-add1-e362af6bf332-kube-api-access-2nndn\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892480 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892529 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqf8\" (UniqueName: \"kubernetes.io/projected/3c71d61c-fde8-4ba1-a572-aac714b424fe-kube-api-access-wxqf8\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892635 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-config-data\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892706 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c71d61c-fde8-4ba1-a572-aac714b424fe-logs\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " 
pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.892714 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.906109 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.909175 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.911570 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-config-data\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.913107 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqf8\" (UniqueName: \"kubernetes.io/projected/3c71d61c-fde8-4ba1-a572-aac714b424fe-kube-api-access-wxqf8\") pod \"nova-metadata-0\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " pod="openstack/nova-metadata-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.977030 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.996505 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-combined-ca-bundle\") pod \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.996746 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p4rz\" (UniqueName: \"kubernetes.io/projected/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-kube-api-access-5p4rz\") pod \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.996941 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-config-data\") pod \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.997110 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-logs\") pod \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\" (UID: \"9545cf88-59bd-4e97-9eb2-25b59b0f3db1\") " Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.997422 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-config-data\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.997555 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nndn\" (UniqueName: \"kubernetes.io/projected/a27a5934-36e6-4c83-add1-e362af6bf332-kube-api-access-2nndn\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:45 crc kubenswrapper[4728]: I1216 15:16:45.997670 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.000039 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-logs" (OuterVolumeSpecName: "logs") pod "9545cf88-59bd-4e97-9eb2-25b59b0f3db1" (UID: "9545cf88-59bd-4e97-9eb2-25b59b0f3db1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.002075 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.002961 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-config-data\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.004618 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-kube-api-access-5p4rz" (OuterVolumeSpecName: "kube-api-access-5p4rz") pod "9545cf88-59bd-4e97-9eb2-25b59b0f3db1" (UID: "9545cf88-59bd-4e97-9eb2-25b59b0f3db1"). InnerVolumeSpecName "kube-api-access-5p4rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.021106 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nndn\" (UniqueName: \"kubernetes.io/projected/a27a5934-36e6-4c83-add1-e362af6bf332-kube-api-access-2nndn\") pod \"nova-scheduler-0\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " pod="openstack/nova-scheduler-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.027495 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9545cf88-59bd-4e97-9eb2-25b59b0f3db1" (UID: "9545cf88-59bd-4e97-9eb2-25b59b0f3db1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.040943 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-config-data" (OuterVolumeSpecName: "config-data") pod "9545cf88-59bd-4e97-9eb2-25b59b0f3db1" (UID: "9545cf88-59bd-4e97-9eb2-25b59b0f3db1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.096523 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.099860 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.099891 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.099903 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.099914 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p4rz\" (UniqueName: \"kubernetes.io/projected/9545cf88-59bd-4e97-9eb2-25b59b0f3db1-kube-api-access-5p4rz\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.191154 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.345973 4728 generic.go:334] "Generic (PLEG): container finished" podID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerID="fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f" exitCode=0 Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.346031 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9545cf88-59bd-4e97-9eb2-25b59b0f3db1","Type":"ContainerDied","Data":"fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f"} Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.346056 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9545cf88-59bd-4e97-9eb2-25b59b0f3db1","Type":"ContainerDied","Data":"311e470bd06f0075dfe81e35da2b2ec49face52972d48215635de442d7a63554"} Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.346072 4728 scope.go:117] "RemoveContainer" containerID="fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.346199 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.391630 4728 scope.go:117] "RemoveContainer" containerID="c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.418896 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.433448 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.443895 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:46 crc kubenswrapper[4728]: E1216 15:16:46.444504 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-log" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.444529 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-log" Dec 16 15:16:46 crc kubenswrapper[4728]: E1216 15:16:46.444563 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-api" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.444572 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-api" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.444773 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-log" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.444790 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" containerName="nova-api-api" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.445937 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.448545 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.459265 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.481670 4728 scope.go:117] "RemoveContainer" containerID="fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f" Dec 16 15:16:46 crc kubenswrapper[4728]: E1216 15:16:46.482156 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f\": container with ID starting with fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f not found: ID does not exist" containerID="fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.482198 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f"} err="failed to get container status \"fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f\": rpc error: code = NotFound desc = could not find container \"fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f\": container with ID starting with fb1625af53eeb4eccdce0c88b9d485a231c616cac49e05239b6deda0a2ebfd8f not found: ID does not exist" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.482228 4728 scope.go:117] "RemoveContainer" containerID="c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989" Dec 16 15:16:46 crc kubenswrapper[4728]: E1216 15:16:46.482889 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989\": container with ID starting with c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989 not found: ID does not exist" containerID="c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.482920 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989"} err="failed to get container status \"c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989\": rpc error: code = NotFound desc = could not find container \"c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989\": container with ID starting with c1b25b63e9e3d9517ed70a936c7a58dce311977adbc48d19b3cea902e46d7989 not found: ID does not exist" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.514785 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-config-data\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.514876 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r92j\" (UniqueName: \"kubernetes.io/projected/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-kube-api-access-6r92j\") pod \"nova-api-0\" (UID: 
\"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.515076 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-logs\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.515293 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.534868 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.616974 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-config-data\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.617038 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r92j\" (UniqueName: \"kubernetes.io/projected/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-kube-api-access-6r92j\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.617127 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-logs\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.617234 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.617783 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-logs\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.622936 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-config-data\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.623510 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.632160 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r92j\" (UniqueName: 
\"kubernetes.io/projected/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-kube-api-access-6r92j\") pod \"nova-api-0\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " pod="openstack/nova-api-0" Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.732126 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:16:46 crc kubenswrapper[4728]: W1216 15:16:46.736330 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27a5934_36e6_4c83_add1_e362af6bf332.slice/crio-56836d2f94bda55823de6c934ae39510aba0782b0b770f0c7227433d64cbd917 WatchSource:0}: Error finding container 56836d2f94bda55823de6c934ae39510aba0782b0b770f0c7227433d64cbd917: Status 404 returned error can't find the container with id 56836d2f94bda55823de6c934ae39510aba0782b0b770f0c7227433d64cbd917 Dec 16 15:16:46 crc kubenswrapper[4728]: I1216 15:16:46.806217 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.330704 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:16:47 crc kubenswrapper[4728]: W1216 15:16:47.332703 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea4dd890_0d1f_4db3_adb6_2b4422cc9d49.slice/crio-1c16b6d4e240ecf1af42a645f0c20b8296ba354853791afea31ea0bbba23c5db WatchSource:0}: Error finding container 1c16b6d4e240ecf1af42a645f0c20b8296ba354853791afea31ea0bbba23c5db: Status 404 returned error can't find the container with id 1c16b6d4e240ecf1af42a645f0c20b8296ba354853791afea31ea0bbba23c5db Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.381031 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a27a5934-36e6-4c83-add1-e362af6bf332","Type":"ContainerStarted","Data":"a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762"} Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.381088 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a27a5934-36e6-4c83-add1-e362af6bf332","Type":"ContainerStarted","Data":"56836d2f94bda55823de6c934ae39510aba0782b0b770f0c7227433d64cbd917"} Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.382829 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c71d61c-fde8-4ba1-a572-aac714b424fe","Type":"ContainerStarted","Data":"84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4"} Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.382862 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c71d61c-fde8-4ba1-a572-aac714b424fe","Type":"ContainerStarted","Data":"9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116"} Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.382880 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c71d61c-fde8-4ba1-a572-aac714b424fe","Type":"ContainerStarted","Data":"ed48f383fe6c5222f6517c9eeb4ef4aab6f470f1cab31bc65c4123a33182fbc0"} Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.385476 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49","Type":"ContainerStarted","Data":"1c16b6d4e240ecf1af42a645f0c20b8296ba354853791afea31ea0bbba23c5db"} Dec 16 15:16:47 
crc kubenswrapper[4728]: I1216 15:16:47.406253 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.406230518 podStartE2EDuration="2.406230518s" podCreationTimestamp="2025-12-16 15:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:47.396776731 +0000 UTC m=+1188.236955715" watchObservedRunningTime="2025-12-16 15:16:47.406230518 +0000 UTC m=+1188.246409502" Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.419863 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.419838997 podStartE2EDuration="2.419838997s" podCreationTimestamp="2025-12-16 15:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:47.41108111 +0000 UTC m=+1188.251260134" watchObservedRunningTime="2025-12-16 15:16:47.419838997 +0000 UTC m=+1188.260018011" Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.517173 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3139eb80-61ac-456d-9834-707a65c6ff98" path="/var/lib/kubelet/pods/3139eb80-61ac-456d-9834-707a65c6ff98/volumes" Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.517873 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e422bb-b40d-4ccf-a882-e6b8ea94cc70" path="/var/lib/kubelet/pods/82e422bb-b40d-4ccf-a882-e6b8ea94cc70/volumes" Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.518465 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9545cf88-59bd-4e97-9eb2-25b59b0f3db1" path="/var/lib/kubelet/pods/9545cf88-59bd-4e97-9eb2-25b59b0f3db1/volumes" Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.683802 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:16:47 crc kubenswrapper[4728]: I1216 15:16:47.684050 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="897f23b2-ad11-44ed-b0d2-623529b5e559" containerName="kube-state-metrics" containerID="cri-o://b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12" gracePeriod=30 Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.154883 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.270906 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg6ms\" (UniqueName: \"kubernetes.io/projected/897f23b2-ad11-44ed-b0d2-623529b5e559-kube-api-access-gg6ms\") pod \"897f23b2-ad11-44ed-b0d2-623529b5e559\" (UID: \"897f23b2-ad11-44ed-b0d2-623529b5e559\") " Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.274623 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897f23b2-ad11-44ed-b0d2-623529b5e559-kube-api-access-gg6ms" (OuterVolumeSpecName: "kube-api-access-gg6ms") pod "897f23b2-ad11-44ed-b0d2-623529b5e559" (UID: "897f23b2-ad11-44ed-b0d2-623529b5e559"). InnerVolumeSpecName "kube-api-access-gg6ms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.373017 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg6ms\" (UniqueName: \"kubernetes.io/projected/897f23b2-ad11-44ed-b0d2-623529b5e559-kube-api-access-gg6ms\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.396735 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49","Type":"ContainerStarted","Data":"961ae57e2d1eab5c2805e7fc1df484744e69ffaaea8745145761363a9770965f"} Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.396800 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49","Type":"ContainerStarted","Data":"86ed6566693bb505ad5447b473122f307aa6bc23a30cd06c77c6fec0d3aa1783"} Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.399262 4728 generic.go:334] "Generic (PLEG): container finished" podID="897f23b2-ad11-44ed-b0d2-623529b5e559" containerID="b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12" exitCode=2 Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.399319 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"897f23b2-ad11-44ed-b0d2-623529b5e559","Type":"ContainerDied","Data":"b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12"} Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.399350 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.399382 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"897f23b2-ad11-44ed-b0d2-623529b5e559","Type":"ContainerDied","Data":"6e898a7cef57539d74dedeffe1ddec0fc9548245e611bb4f2dead9058509a8f3"} Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.399435 4728 scope.go:117] "RemoveContainer" containerID="b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.428016 4728 scope.go:117] "RemoveContainer" containerID="b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.429790 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.429763152 podStartE2EDuration="2.429763152s" podCreationTimestamp="2025-12-16 15:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:48.428662402 +0000 UTC m=+1189.268841386" watchObservedRunningTime="2025-12-16 15:16:48.429763152 +0000 UTC m=+1189.269942146" Dec 16 15:16:48 crc kubenswrapper[4728]: E1216 15:16:48.430867 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12\": container with ID starting with b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12 not found: ID does not exist" containerID="b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.430909 4728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12"} err="failed to get container status \"b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12\": rpc error: code = NotFound desc = could not find container \"b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12\": container with ID starting with b31cfd4673257c1f28a93971e8ed248bf3b5ca5a4c4844eaabaef4a207cc2d12 not found: ID does not exist" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.458239 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.479054 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.491391 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:16:48 crc kubenswrapper[4728]: E1216 15:16:48.491955 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897f23b2-ad11-44ed-b0d2-623529b5e559" containerName="kube-state-metrics" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.491994 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="897f23b2-ad11-44ed-b0d2-623529b5e559" containerName="kube-state-metrics" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.492235 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="897f23b2-ad11-44ed-b0d2-623529b5e559" containerName="kube-state-metrics" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.493001 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.495803 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.496288 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.507425 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.576528 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.576568 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.576617 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhbp\" (UniqueName: \"kubernetes.io/projected/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-api-access-bjhbp\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.576711 4728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.678244 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.679072 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.679139 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhbp\" (UniqueName: \"kubernetes.io/projected/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-api-access-bjhbp\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.679218 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.689095 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.689443 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.696154 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6bff3-10b0-4969-b7ef-f31cee80091d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.696564 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhbp\" (UniqueName: \"kubernetes.io/projected/dca6bff3-10b0-4969-b7ef-f31cee80091d-kube-api-access-bjhbp\") pod \"kube-state-metrics-0\" (UID: \"dca6bff3-10b0-4969-b7ef-f31cee80091d\") " pod="openstack/kube-state-metrics-0" Dec 16 15:16:48 crc kubenswrapper[4728]: I1216 15:16:48.820817 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.294195 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 16 15:16:49 crc kubenswrapper[4728]: W1216 15:16:49.295191 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca6bff3_10b0_4969_b7ef_f31cee80091d.slice/crio-392183490a1c5c367b6838d09aacd8e98178c8979c350fb8726792a061d70f70 WatchSource:0}: Error finding container 392183490a1c5c367b6838d09aacd8e98178c8979c350fb8726792a061d70f70: Status 404 returned error can't find the container with id 392183490a1c5c367b6838d09aacd8e98178c8979c350fb8726792a061d70f70 Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.407816 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dca6bff3-10b0-4969-b7ef-f31cee80091d","Type":"ContainerStarted","Data":"392183490a1c5c367b6838d09aacd8e98178c8979c350fb8726792a061d70f70"} Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.409296 4728 generic.go:334] "Generic (PLEG): container finished" podID="67785338-b264-4d86-b1b5-6ca4248d938f" containerID="9434c9173fb0736f4e65b34da0a27088919248ae8bcafea8cb472449cda9bf4e" exitCode=0 Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.409333 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" event={"ID":"67785338-b264-4d86-b1b5-6ca4248d938f","Type":"ContainerDied","Data":"9434c9173fb0736f4e65b34da0a27088919248ae8bcafea8cb472449cda9bf4e"} Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.461732 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.462056 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-central-agent" containerID="cri-o://6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682" gracePeriod=30 Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.462086 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="sg-core" containerID="cri-o://86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01" gracePeriod=30 Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.462181 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="proxy-httpd" containerID="cri-o://a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432" gracePeriod=30 Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.462247 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-notification-agent" containerID="cri-o://3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486" gracePeriod=30 Dec 16 15:16:49 crc kubenswrapper[4728]: I1216 15:16:49.522827 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="897f23b2-ad11-44ed-b0d2-623529b5e559" path="/var/lib/kubelet/pods/897f23b2-ad11-44ed-b0d2-623529b5e559/volumes" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.430505 4728 generic.go:334] "Generic (PLEG): container 
finished" podID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerID="a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432" exitCode=0 Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.431143 4728 generic.go:334] "Generic (PLEG): container finished" podID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerID="86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01" exitCode=2 Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.431156 4728 generic.go:334] "Generic (PLEG): container finished" podID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerID="6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682" exitCode=0 Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.431213 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerDied","Data":"a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432"} Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.431242 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerDied","Data":"86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01"} Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.431252 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerDied","Data":"6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682"} Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.433109 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dca6bff3-10b0-4969-b7ef-f31cee80091d","Type":"ContainerStarted","Data":"3f0231122ee8929650cb7127f280af54e6be0c9abe85d037d20a3a7540193dbc"} Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.461864 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.117882007 podStartE2EDuration="2.461842844s" podCreationTimestamp="2025-12-16 15:16:48 +0000 UTC" firstStartedPulling="2025-12-16 15:16:49.297528394 +0000 UTC m=+1190.137707378" lastFinishedPulling="2025-12-16 15:16:49.641489231 +0000 UTC m=+1190.481668215" observedRunningTime="2025-12-16 15:16:50.45213629 +0000 UTC m=+1191.292315284" watchObservedRunningTime="2025-12-16 15:16:50.461842844 +0000 UTC m=+1191.302021828" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.773423 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.822680 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qk7h\" (UniqueName: \"kubernetes.io/projected/67785338-b264-4d86-b1b5-6ca4248d938f-kube-api-access-9qk7h\") pod \"67785338-b264-4d86-b1b5-6ca4248d938f\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.823051 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-scripts\") pod \"67785338-b264-4d86-b1b5-6ca4248d938f\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.823090 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-combined-ca-bundle\") pod \"67785338-b264-4d86-b1b5-6ca4248d938f\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.823273 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-config-data\") pod \"67785338-b264-4d86-b1b5-6ca4248d938f\" (UID: \"67785338-b264-4d86-b1b5-6ca4248d938f\") " Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.829219 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67785338-b264-4d86-b1b5-6ca4248d938f-kube-api-access-9qk7h" (OuterVolumeSpecName: "kube-api-access-9qk7h") pod "67785338-b264-4d86-b1b5-6ca4248d938f" (UID: "67785338-b264-4d86-b1b5-6ca4248d938f"). InnerVolumeSpecName "kube-api-access-9qk7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.829272 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-scripts" (OuterVolumeSpecName: "scripts") pod "67785338-b264-4d86-b1b5-6ca4248d938f" (UID: "67785338-b264-4d86-b1b5-6ca4248d938f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.847926 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-config-data" (OuterVolumeSpecName: "config-data") pod "67785338-b264-4d86-b1b5-6ca4248d938f" (UID: "67785338-b264-4d86-b1b5-6ca4248d938f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.859718 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67785338-b264-4d86-b1b5-6ca4248d938f" (UID: "67785338-b264-4d86-b1b5-6ca4248d938f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.924990 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qk7h\" (UniqueName: \"kubernetes.io/projected/67785338-b264-4d86-b1b5-6ca4248d938f-kube-api-access-9qk7h\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.925020 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.925031 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:50 crc kubenswrapper[4728]: I1216 15:16:50.925041 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67785338-b264-4d86-b1b5-6ca4248d938f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.097173 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.097465 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.192622 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.444958 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" event={"ID":"67785338-b264-4d86-b1b5-6ca4248d938f","Type":"ContainerDied","Data":"f8d15615f6b792fe02a5897a56834f724b5d4405890176b86fee7e4273a2c4a8"} Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.445022 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d15615f6b792fe02a5897a56834f724b5d4405890176b86fee7e4273a2c4a8" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.445151 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.445279 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fkg4z" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.516869 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 15:16:51 crc kubenswrapper[4728]: E1216 15:16:51.517256 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67785338-b264-4d86-b1b5-6ca4248d938f" containerName="nova-cell1-conductor-db-sync" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.517282 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="67785338-b264-4d86-b1b5-6ca4248d938f" containerName="nova-cell1-conductor-db-sync" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.517619 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="67785338-b264-4d86-b1b5-6ca4248d938f" containerName="nova-cell1-conductor-db-sync" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.518325 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.522249 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.523752 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.643665 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/170e3d88-1e9a-4e6b-aead-ced16b98610e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.644125 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4krnk\" (UniqueName: \"kubernetes.io/projected/170e3d88-1e9a-4e6b-aead-ced16b98610e-kube-api-access-4krnk\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.644222 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/170e3d88-1e9a-4e6b-aead-ced16b98610e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.746310 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4krnk\" (UniqueName: \"kubernetes.io/projected/170e3d88-1e9a-4e6b-aead-ced16b98610e-kube-api-access-4krnk\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.746798 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/170e3d88-1e9a-4e6b-aead-ced16b98610e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.747792 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/170e3d88-1e9a-4e6b-aead-ced16b98610e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.761382 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/170e3d88-1e9a-4e6b-aead-ced16b98610e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.762666 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/170e3d88-1e9a-4e6b-aead-ced16b98610e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.768083 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4krnk\" (UniqueName: \"kubernetes.io/projected/170e3d88-1e9a-4e6b-aead-ced16b98610e-kube-api-access-4krnk\") pod \"nova-cell1-conductor-0\" (UID: \"170e3d88-1e9a-4e6b-aead-ced16b98610e\") " pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.841946 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:51 crc kubenswrapper[4728]: I1216 15:16:51.993113 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054578 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-config-data\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054738 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-sg-core-conf-yaml\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054822 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-run-httpd\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054877 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjjbt\" (UniqueName: \"kubernetes.io/projected/8a410ae7-1999-4ce1-a2bb-94158e1b917e-kube-api-access-xjjbt\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054910 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-combined-ca-bundle\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054971 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-scripts\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.054992 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-log-httpd\") pod \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\" (UID: \"8a410ae7-1999-4ce1-a2bb-94158e1b917e\") " Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.062879 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.063341 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.066180 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a410ae7-1999-4ce1-a2bb-94158e1b917e-kube-api-access-xjjbt" (OuterVolumeSpecName: "kube-api-access-xjjbt") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "kube-api-access-xjjbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.076592 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-scripts" (OuterVolumeSpecName: "scripts") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.097571 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.156695 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.156740 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.156756 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjjbt\" (UniqueName: \"kubernetes.io/projected/8a410ae7-1999-4ce1-a2bb-94158e1b917e-kube-api-access-xjjbt\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.156772 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.156783 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a410ae7-1999-4ce1-a2bb-94158e1b917e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.162333 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.189424 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-config-data" (OuterVolumeSpecName: "config-data") pod "8a410ae7-1999-4ce1-a2bb-94158e1b917e" (UID: "8a410ae7-1999-4ce1-a2bb-94158e1b917e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.258801 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.258838 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a410ae7-1999-4ce1-a2bb-94158e1b917e-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.455714 4728 generic.go:334] "Generic (PLEG): container finished" podID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerID="3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486" exitCode=0 Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.455771 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.455836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerDied","Data":"3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486"} Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.455875 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a410ae7-1999-4ce1-a2bb-94158e1b917e","Type":"ContainerDied","Data":"52b7ab2ca84ae77fc776add517dba339c064ef74a55948372e3fe66584cded91"} Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.455896 4728 scope.go:117] "RemoveContainer" containerID="a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.485647 4728 scope.go:117] "RemoveContainer" containerID="86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.511855 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.519306 4728 scope.go:117] "RemoveContainer" containerID="3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.520668 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.548975 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.574524 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.574877 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-central-agent" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.574893 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" 
containerName="ceilometer-central-agent" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.574909 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="sg-core" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.574914 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="sg-core" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.574933 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-notification-agent" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.574939 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-notification-agent" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.574955 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="proxy-httpd" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.574951 4728 scope.go:117] "RemoveContainer" containerID="6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.574960 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="proxy-httpd" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.575221 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="proxy-httpd" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.575243 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-notification-agent" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.575257 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="sg-core" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.575270 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" containerName="ceilometer-central-agent" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.576813 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.579439 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.579557 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.579598 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.589332 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.615524 4728 scope.go:117] "RemoveContainer" containerID="a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.616053 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432\": container with ID starting with a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432 not found: ID does not exist" containerID="a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.616108 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432"} err="failed to get container status \"a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432\": rpc error: code = NotFound desc = could not find container \"a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432\": container with ID starting with a1809caf19e2f58a9817dda817c7749ceaeb098aedd09d2ff9a8abb461479432 not found: ID does not exist" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.616146 4728 scope.go:117] "RemoveContainer" containerID="86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.616766 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01\": container with ID starting with 86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01 not found: ID does not exist" containerID="86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.616808 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01"} err="failed to get container status \"86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01\": rpc error: code = NotFound desc = could not find container \"86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01\": container with ID starting with 86b377ea5a27c9091345e672a49cd5ee9de7084baaffa8be829d7ddadca09e01 not found: ID does not exist" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.616838 4728 scope.go:117] "RemoveContainer" containerID="3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.617140 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486\": container with ID starting with 3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486 not found: ID does not exist" containerID="3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.617172 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486"} err="failed to get container status \"3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486\": rpc error: code = NotFound desc = could not find container \"3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486\": container with ID starting with 3fe2c0c510829fe0956bf3126105fab88f331e728779f81163267e271785d486 not found: ID does not exist" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.617190 4728 scope.go:117] "RemoveContainer" containerID="6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682" Dec 16 15:16:52 crc kubenswrapper[4728]: E1216 15:16:52.617509 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682\": container with ID starting with 6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682 not found: ID does not exist" containerID="6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.617566 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682"} err="failed to get container status \"6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682\": rpc error: code = NotFound desc = could not find container \"6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682\": container with ID starting with 6c474d28703cba00962a95f7bbb00215d92ae5183687f881c56783d02741b682 not found: ID does not exist" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674708 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674757 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-config-data\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674791 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674822 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-log-httpd\") pod \"ceilometer-0\" (UID: 
\"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674864 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-scripts\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674891 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-run-httpd\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674912 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc8g7\" (UniqueName: \"kubernetes.io/projected/6e979bde-daa7-4707-8198-8d306ff07e3f-kube-api-access-wc8g7\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.674929 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.776940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-scripts\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777260 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-run-httpd\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777281 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc8g7\" (UniqueName: \"kubernetes.io/projected/6e979bde-daa7-4707-8198-8d306ff07e3f-kube-api-access-wc8g7\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777300 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777424 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777449 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-config-data\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777484 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777528 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-log-httpd\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777806 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-run-httpd\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.777929 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-log-httpd\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.782466 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.785616 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-config-data\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.786051 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-scripts\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.786716 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.789279 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.797119 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc8g7\" (UniqueName: 
\"kubernetes.io/projected/6e979bde-daa7-4707-8198-8d306ff07e3f-kube-api-access-wc8g7\") pod \"ceilometer-0\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " pod="openstack/ceilometer-0" Dec 16 15:16:52 crc kubenswrapper[4728]: I1216 15:16:52.898207 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.404889 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:16:53 crc kubenswrapper[4728]: W1216 15:16:53.406355 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e979bde_daa7_4707_8198_8d306ff07e3f.slice/crio-0e61ae102a3477e6706cc97b14de91fc5aa5ce8d8c5459ceb9fed720a66bea4d WatchSource:0}: Error finding container 0e61ae102a3477e6706cc97b14de91fc5aa5ce8d8c5459ceb9fed720a66bea4d: Status 404 returned error can't find the container with id 0e61ae102a3477e6706cc97b14de91fc5aa5ce8d8c5459ceb9fed720a66bea4d Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.408912 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.467370 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerStarted","Data":"0e61ae102a3477e6706cc97b14de91fc5aa5ce8d8c5459ceb9fed720a66bea4d"} Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.468694 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"170e3d88-1e9a-4e6b-aead-ced16b98610e","Type":"ContainerStarted","Data":"9ffa5226d4d73eac5795822bea7b8a9473a231e2860b68da5dd47f8ccea65ea3"} Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.468723 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"170e3d88-1e9a-4e6b-aead-ced16b98610e","Type":"ContainerStarted","Data":"34d105e005b19b3b6752c0752393bf559f56444fdf550530c1f52974a469031b"} Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.469472 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.491737 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.491715601 podStartE2EDuration="2.491715601s" podCreationTimestamp="2025-12-16 15:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:16:53.487308012 +0000 UTC m=+1194.327486996" watchObservedRunningTime="2025-12-16 15:16:53.491715601 +0000 UTC m=+1194.331894585" Dec 16 15:16:53 crc kubenswrapper[4728]: I1216 15:16:53.515616 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a410ae7-1999-4ce1-a2bb-94158e1b917e" path="/var/lib/kubelet/pods/8a410ae7-1999-4ce1-a2bb-94158e1b917e/volumes" Dec 16 15:16:55 crc kubenswrapper[4728]: I1216 15:16:55.490448 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerStarted","Data":"70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21"} Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.098089 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.098474 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.192522 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.223741 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.503039 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerStarted","Data":"1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641"} Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.538309 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.807076 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:16:56 crc kubenswrapper[4728]: I1216 15:16:56.807795 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:16:57 crc kubenswrapper[4728]: I1216 15:16:57.113569 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:57 crc kubenswrapper[4728]: I1216 15:16:57.113594 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:57 crc kubenswrapper[4728]: I1216 15:16:57.521252 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerStarted","Data":"ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e"} Dec 16 15:16:57 crc kubenswrapper[4728]: I1216 15:16:57.889664 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:57 crc kubenswrapper[4728]: I1216 15:16:57.889663 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 16 15:16:58 crc kubenswrapper[4728]: I1216 15:16:58.831650 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 16 15:16:59 crc kubenswrapper[4728]: I1216 15:16:59.542681 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerStarted","Data":"190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a"} Dec 16 15:16:59 crc kubenswrapper[4728]: I1216 15:16:59.543013 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:16:59 crc kubenswrapper[4728]: I1216 15:16:59.603677 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.742535072 podStartE2EDuration="7.603657174s" podCreationTimestamp="2025-12-16 15:16:52 +0000 UTC" firstStartedPulling="2025-12-16 15:16:53.408691775 +0000 UTC m=+1194.248870759" lastFinishedPulling="2025-12-16 15:16:58.269813877 +0000 UTC m=+1199.109992861" observedRunningTime="2025-12-16 15:16:59.591997097 +0000 UTC m=+1200.432176071" watchObservedRunningTime="2025-12-16 15:16:59.603657174 +0000 UTC m=+1200.443836158" Dec 16 15:17:01 crc kubenswrapper[4728]: I1216 15:17:01.882167 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.104716 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.105147 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.111261 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.118447 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.812220 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.812946 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.814639 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:17:06 crc kubenswrapper[4728]: I1216 15:17:06.816216 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.659493 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.663769 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.838789 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-l6m2w"] Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.840453 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.850907 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-l6m2w"] Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.991026 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-config\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.991353 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.991399 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2k27\" (UniqueName: \"kubernetes.io/projected/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-kube-api-access-d2k27\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.991479 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.991522 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:07 crc kubenswrapper[4728]: I1216 15:17:07.991547 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.093257 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-config\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.093546 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.093665 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d2k27\" (UniqueName: \"kubernetes.io/projected/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-kube-api-access-d2k27\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.093766 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.093866 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.093955 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.094773 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.094809 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-config\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.094960 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.094986 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.094958 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.115328 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2k27\" (UniqueName: 
\"kubernetes.io/projected/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-kube-api-access-d2k27\") pod \"dnsmasq-dns-89c5cd4d5-l6m2w\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.168129 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:08 crc kubenswrapper[4728]: E1216 15:17:08.445094 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f3c8e2b_5ba5_482f_9e11_b3daacb44963.slice/crio-b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.689746 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.713101 4728 generic.go:334] "Generic (PLEG): container finished" podID="1f3c8e2b-5ba5-482f-9e11-b3daacb44963" containerID="b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12" exitCode=137 Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.714207 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.714387 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f3c8e2b-5ba5-482f-9e11-b3daacb44963","Type":"ContainerDied","Data":"b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12"} Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.714438 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f3c8e2b-5ba5-482f-9e11-b3daacb44963","Type":"ContainerDied","Data":"a2bc05e2234fec534688ba06e26f955c96cc43eef7b21b32d534f572dfa9ec7b"} Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.714458 4728 scope.go:117] "RemoveContainer" containerID="b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.794852 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-l6m2w"] Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.811425 4728 scope.go:117] "RemoveContainer" containerID="b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12" Dec 16 15:17:08 crc kubenswrapper[4728]: E1216 15:17:08.811902 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12\": container with ID starting with b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12 not found: ID does not exist" containerID="b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.811956 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12"} err="failed to get container status \"b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12\": rpc error: code = NotFound desc = could not find container \"b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12\": container with ID starting with 
b0433b7062a661e3f6486a5519ff7840422111f40ef39a68924b8a12abd71e12 not found: ID does not exist" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.812144 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-config-data\") pod \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.812193 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pzfb\" (UniqueName: \"kubernetes.io/projected/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-kube-api-access-7pzfb\") pod \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.812462 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-combined-ca-bundle\") pod \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\" (UID: \"1f3c8e2b-5ba5-482f-9e11-b3daacb44963\") " Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.819676 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.819731 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.820347 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-kube-api-access-7pzfb" (OuterVolumeSpecName: "kube-api-access-7pzfb") pod "1f3c8e2b-5ba5-482f-9e11-b3daacb44963" (UID: "1f3c8e2b-5ba5-482f-9e11-b3daacb44963"). InnerVolumeSpecName "kube-api-access-7pzfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.821997 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pzfb\" (UniqueName: \"kubernetes.io/projected/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-kube-api-access-7pzfb\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.845987 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-config-data" (OuterVolumeSpecName: "config-data") pod "1f3c8e2b-5ba5-482f-9e11-b3daacb44963" (UID: "1f3c8e2b-5ba5-482f-9e11-b3daacb44963"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.846519 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f3c8e2b-5ba5-482f-9e11-b3daacb44963" (UID: "1f3c8e2b-5ba5-482f-9e11-b3daacb44963"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.923149 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:08 crc kubenswrapper[4728]: I1216 15:17:08.923178 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f3c8e2b-5ba5-482f-9e11-b3daacb44963-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.050671 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.058589 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.093883 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:09 crc kubenswrapper[4728]: E1216 15:17:09.094518 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3c8e2b-5ba5-482f-9e11-b3daacb44963" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.094552 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3c8e2b-5ba5-482f-9e11-b3daacb44963" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.094845 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3c8e2b-5ba5-482f-9e11-b3daacb44963" containerName="nova-cell1-novncproxy-novncproxy" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.096010 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.101961 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.102193 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.102341 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.122639 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.235702 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hclhh\" (UniqueName: \"kubernetes.io/projected/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-kube-api-access-hclhh\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.236193 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.236327 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.236361 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.236453 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.338309 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hclhh\" (UniqueName: \"kubernetes.io/projected/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-kube-api-access-hclhh\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.338577 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.338685 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.338758 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.338865 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.342974 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.343075 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.343199 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.343476 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.355975 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclhh\" (UniqueName: \"kubernetes.io/projected/25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc-kube-api-access-hclhh\") pod \"nova-cell1-novncproxy-0\" (UID: \"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.513756 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.520849 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3c8e2b-5ba5-482f-9e11-b3daacb44963" path="/var/lib/kubelet/pods/1f3c8e2b-5ba5-482f-9e11-b3daacb44963/volumes" Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.729845 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerID="f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99" exitCode=0 Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.730267 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" event={"ID":"d6f19c63-40d1-4c3d-9c6d-027581f63b2c","Type":"ContainerDied","Data":"f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99"} Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.730309 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" event={"ID":"d6f19c63-40d1-4c3d-9c6d-027581f63b2c","Type":"ContainerStarted","Data":"7f6084bd837fff635fefdc76235be32415252f425e11871db0f24adc90522b95"} Dec 16 15:17:09 crc kubenswrapper[4728]: W1216 15:17:09.981497 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25eae3b4_b392_4ce4_b16c_5aa9d2cb78fc.slice/crio-ad6e9ca32815f6879a0eec7fe4a24a3c1627c8a6dc51f971d0378e57a70b0e4e WatchSource:0}: Error finding container ad6e9ca32815f6879a0eec7fe4a24a3c1627c8a6dc51f971d0378e57a70b0e4e: Status 404 returned error can't find the container with id ad6e9ca32815f6879a0eec7fe4a24a3c1627c8a6dc51f971d0378e57a70b0e4e Dec 16 15:17:09 crc kubenswrapper[4728]: I1216 15:17:09.988116 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.133461 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.134083 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-central-agent" containerID="cri-o://70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21" gracePeriod=30 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.134152 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="sg-core" containerID="cri-o://ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e" gracePeriod=30 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.134228 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-notification-agent" containerID="cri-o://1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641" gracePeriod=30 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.134252 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="proxy-httpd" containerID="cri-o://190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a" gracePeriod=30 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.163901 4728 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ceilometer-0" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.196:3000/\": read tcp 10.217.0.2:51612->10.217.0.196:3000: read: connection reset by peer" Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.741664 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerID="190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a" exitCode=0 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.741702 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerID="ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e" exitCode=2 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.741712 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerID="70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21" exitCode=0 Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.741709 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerDied","Data":"190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a"} Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.741760 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerDied","Data":"ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e"} Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.741774 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerDied","Data":"70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21"} Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.749161 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" event={"ID":"d6f19c63-40d1-4c3d-9c6d-027581f63b2c","Type":"ContainerStarted","Data":"aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c"} Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.749314 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.751353 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc","Type":"ContainerStarted","Data":"98009bdbe0f0be1818d71663b3b9494c0822dfb5afcc499008525eb02783c48e"} Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.751383 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc","Type":"ContainerStarted","Data":"ad6e9ca32815f6879a0eec7fe4a24a3c1627c8a6dc51f971d0378e57a70b0e4e"} Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.776239 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" podStartSLOduration=3.77621727 podStartE2EDuration="3.77621727s" podCreationTimestamp="2025-12-16 15:17:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:10.767568045 +0000 UTC m=+1211.607747049" watchObservedRunningTime="2025-12-16 
15:17:10.77621727 +0000 UTC m=+1211.616396254" Dec 16 15:17:10 crc kubenswrapper[4728]: I1216 15:17:10.791048 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.791028923 podStartE2EDuration="1.791028923s" podCreationTimestamp="2025-12-16 15:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:10.787552678 +0000 UTC m=+1211.627731702" watchObservedRunningTime="2025-12-16 15:17:10.791028923 +0000 UTC m=+1211.631207907" Dec 16 15:17:11 crc kubenswrapper[4728]: I1216 15:17:11.441654 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:11 crc kubenswrapper[4728]: I1216 15:17:11.442121 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-log" containerID="cri-o://86ed6566693bb505ad5447b473122f307aa6bc23a30cd06c77c6fec0d3aa1783" gracePeriod=30 Dec 16 15:17:11 crc kubenswrapper[4728]: I1216 15:17:11.442274 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-api" containerID="cri-o://961ae57e2d1eab5c2805e7fc1df484744e69ffaaea8745145761363a9770965f" gracePeriod=30 Dec 16 15:17:11 crc kubenswrapper[4728]: I1216 15:17:11.789108 4728 generic.go:334] "Generic (PLEG): container finished" podID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerID="86ed6566693bb505ad5447b473122f307aa6bc23a30cd06c77c6fec0d3aa1783" exitCode=143 Dec 16 15:17:11 crc kubenswrapper[4728]: I1216 15:17:11.789960 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49","Type":"ContainerDied","Data":"86ed6566693bb505ad5447b473122f307aa6bc23a30cd06c77c6fec0d3aa1783"} Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.393954 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507157 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-scripts\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507254 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-config-data\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507304 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-combined-ca-bundle\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507349 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc8g7\" (UniqueName: \"kubernetes.io/projected/6e979bde-daa7-4707-8198-8d306ff07e3f-kube-api-access-wc8g7\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507386 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-run-httpd\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507427 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-ceilometer-tls-certs\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507547 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-sg-core-conf-yaml\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.507584 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-log-httpd\") pod \"6e979bde-daa7-4707-8198-8d306ff07e3f\" (UID: \"6e979bde-daa7-4707-8198-8d306ff07e3f\") " Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.508256 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.508367 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.513960 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-scripts" (OuterVolumeSpecName: "scripts") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.514058 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e979bde-daa7-4707-8198-8d306ff07e3f-kube-api-access-wc8g7" (OuterVolumeSpecName: "kube-api-access-wc8g7") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "kube-api-access-wc8g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.543611 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.564222 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.583926 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.607988 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-config-data" (OuterVolumeSpecName: "config-data") pod "6e979bde-daa7-4707-8198-8d306ff07e3f" (UID: "6e979bde-daa7-4707-8198-8d306ff07e3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610317 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc8g7\" (UniqueName: \"kubernetes.io/projected/6e979bde-daa7-4707-8198-8d306ff07e3f-kube-api-access-wc8g7\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610331 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610339 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610349 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610383 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e979bde-daa7-4707-8198-8d306ff07e3f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610391 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610399 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.610542 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e979bde-daa7-4707-8198-8d306ff07e3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.798681 4728 generic.go:334] "Generic (PLEG): container finished" podID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerID="1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641" exitCode=0 Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.798728 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerDied","Data":"1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641"} Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.798755 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e979bde-daa7-4707-8198-8d306ff07e3f","Type":"ContainerDied","Data":"0e61ae102a3477e6706cc97b14de91fc5aa5ce8d8c5459ceb9fed720a66bea4d"} Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.798773 4728 scope.go:117] "RemoveContainer" containerID="190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.798915 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.838219 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.840162 4728 scope.go:117] "RemoveContainer" containerID="ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.849204 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858035 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.858521 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-central-agent" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858545 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-central-agent" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.858568 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="proxy-httpd" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858578 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="proxy-httpd" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.858605 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="sg-core" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858614 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="sg-core" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.858636 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-notification-agent" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858645 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-notification-agent" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858873 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-notification-agent" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858895 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="proxy-httpd" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858910 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="sg-core" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.858928 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" containerName="ceilometer-central-agent" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.861146 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.864469 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.864552 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.864657 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.864929 4728 scope.go:117] "RemoveContainer" containerID="1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.871898 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.902624 4728 scope.go:117] "RemoveContainer" containerID="70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.924795 4728 scope.go:117] "RemoveContainer" containerID="190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.925231 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a\": container with ID starting with 190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a not found: ID does not exist" containerID="190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.925276 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a"} err="failed to get container status \"190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a\": rpc error: code = NotFound desc = could not find container \"190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a\": container with ID starting with 190f82e0a005391bcf55fdf989a1e2093f11639d7588cc40328ca2f6d698e36a not found: ID does not exist" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.925302 4728 scope.go:117] "RemoveContainer" containerID="ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.925645 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e\": container with ID starting with ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e not found: ID does not exist" containerID="ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.925759 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e"} err="failed to get container status \"ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e\": rpc error: code = NotFound desc = could not find container \"ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e\": container with ID starting with ebb32c08ea80e632b93a05596d2d410f15e3d52e6c4abec6fae9df0cfec3400e not found: ID does not exist" Dec 16 15:17:12 
crc kubenswrapper[4728]: I1216 15:17:12.925856 4728 scope.go:117] "RemoveContainer" containerID="1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.926265 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641\": container with ID starting with 1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641 not found: ID does not exist" containerID="1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.926325 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641"} err="failed to get container status \"1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641\": rpc error: code = NotFound desc = could not find container \"1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641\": container with ID starting with 1aa01303a243452eeee3eb692ce909e78e852174c951575d39b73f02c5bb4641 not found: ID does not exist" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.926340 4728 scope.go:117] "RemoveContainer" containerID="70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21" Dec 16 15:17:12 crc kubenswrapper[4728]: E1216 15:17:12.926735 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21\": container with ID starting with 70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21 not found: ID does not exist" containerID="70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21" Dec 16 15:17:12 crc kubenswrapper[4728]: I1216 15:17:12.926838 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21"} err="failed to get container status \"70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21\": rpc error: code = NotFound desc = could not find container \"70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21\": container with ID starting with 70a2a823540c6fbdbfd417a82aa4ffe4cd1b2ac4ea46410fbc20e7b4122cdf21 not found: ID does not exist" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.016693 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-run-httpd\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.016987 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l54mh\" (UniqueName: \"kubernetes.io/projected/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-kube-api-access-l54mh\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.017021 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.017040 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-scripts\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.017061 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-log-httpd\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.017106 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.017187 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-config-data\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.017217 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.118439 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.118979 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-config-data\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119088 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119178 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-run-httpd\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119295 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l54mh\" (UniqueName: 
\"kubernetes.io/projected/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-kube-api-access-l54mh\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119382 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119499 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-scripts\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119615 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-log-httpd\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119642 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-run-httpd\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.119913 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-log-httpd\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.123300 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.124253 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.124301 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.124780 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-scripts\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.125439 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-config-data\") 
pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.138180 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l54mh\" (UniqueName: \"kubernetes.io/projected/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-kube-api-access-l54mh\") pod \"ceilometer-0\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.179946 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.516955 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e979bde-daa7-4707-8198-8d306ff07e3f" path="/var/lib/kubelet/pods/6e979bde-daa7-4707-8198-8d306ff07e3f/volumes" Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.652857 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.796587 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:13 crc kubenswrapper[4728]: I1216 15:17:13.809826 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerStarted","Data":"27396769306a9514cc58ab1ab835aa09f5f73b2d681f7f0adea11daf7eb34d33"} Dec 16 15:17:14 crc kubenswrapper[4728]: I1216 15:17:14.515048 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:14 crc kubenswrapper[4728]: I1216 15:17:14.827772 4728 generic.go:334] "Generic (PLEG): container finished" podID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerID="961ae57e2d1eab5c2805e7fc1df484744e69ffaaea8745145761363a9770965f" exitCode=0 Dec 16 15:17:14 crc kubenswrapper[4728]: I1216 15:17:14.827825 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49","Type":"ContainerDied","Data":"961ae57e2d1eab5c2805e7fc1df484744e69ffaaea8745145761363a9770965f"} Dec 16 15:17:14 crc kubenswrapper[4728]: I1216 15:17:14.829730 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerStarted","Data":"3dc22c3bbeef54452440a9523a8aed13a666d7da8e2e4ff805714dafe4c19f5f"} Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.126976 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.188209 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-logs\") pod \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.188565 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r92j\" (UniqueName: \"kubernetes.io/projected/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-kube-api-access-6r92j\") pod \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.188609 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-config-data\") pod \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.188642 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-combined-ca-bundle\") pod \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\" (UID: \"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49\") " Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.194514 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-logs" (OuterVolumeSpecName: "logs") pod "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" (UID: "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.215591 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-kube-api-access-6r92j" (OuterVolumeSpecName: "kube-api-access-6r92j") pod "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" (UID: "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49"). InnerVolumeSpecName "kube-api-access-6r92j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.254302 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" (UID: "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.280586 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-config-data" (OuterVolumeSpecName: "config-data") pod "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" (UID: "ea4dd890-0d1f-4db3-adb6-2b4422cc9d49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.290128 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.290159 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r92j\" (UniqueName: \"kubernetes.io/projected/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-kube-api-access-6r92j\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.290171 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.290184 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.858171 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea4dd890-0d1f-4db3-adb6-2b4422cc9d49","Type":"ContainerDied","Data":"1c16b6d4e240ecf1af42a645f0c20b8296ba354853791afea31ea0bbba23c5db"} Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.858214 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.858609 4728 scope.go:117] "RemoveContainer" containerID="961ae57e2d1eab5c2805e7fc1df484744e69ffaaea8745145761363a9770965f" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.865015 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerStarted","Data":"c5716f039d5208c9524406c4f1a1382099e93f36100c25d7df7c2e2fdf200a9e"} Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.941579 4728 scope.go:117] "RemoveContainer" containerID="86ed6566693bb505ad5447b473122f307aa6bc23a30cd06c77c6fec0d3aa1783" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.955715 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.977108 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.987997 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:15 crc kubenswrapper[4728]: E1216 15:17:15.988611 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-log" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.988628 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-log" Dec 16 15:17:15 crc kubenswrapper[4728]: E1216 15:17:15.988639 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-api" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.988647 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-api" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.988861 4728 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-log" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.988886 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" containerName="nova-api-api" Dec 16 15:17:15 crc kubenswrapper[4728]: I1216 15:17:15.990171 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.003907 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.004148 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.004300 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.007601 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.106683 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.107435 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91fcaf90-dcf6-472f-a45d-58749d47bf43-logs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.107646 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-public-tls-certs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.107723 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.107815 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfnnz\" (UniqueName: \"kubernetes.io/projected/91fcaf90-dcf6-472f-a45d-58749d47bf43-kube-api-access-zfnnz\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.107862 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-config-data\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.209645 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.209726 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91fcaf90-dcf6-472f-a45d-58749d47bf43-logs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.209748 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-public-tls-certs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.209767 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.209830 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfnnz\" (UniqueName: \"kubernetes.io/projected/91fcaf90-dcf6-472f-a45d-58749d47bf43-kube-api-access-zfnnz\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.209874 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-config-data\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.214522 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91fcaf90-dcf6-472f-a45d-58749d47bf43-logs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.216184 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-config-data\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.216930 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-public-tls-certs\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.222096 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.229749 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.229751 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfnnz\" (UniqueName: \"kubernetes.io/projected/91fcaf90-dcf6-472f-a45d-58749d47bf43-kube-api-access-zfnnz\") pod \"nova-api-0\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.316719 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.760053 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.879573 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91fcaf90-dcf6-472f-a45d-58749d47bf43","Type":"ContainerStarted","Data":"93805e4fcce6f8a09ce8f476e0e67787142952878b6963a85471dac4ff1f8b26"} Dec 16 15:17:16 crc kubenswrapper[4728]: I1216 15:17:16.881785 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerStarted","Data":"759832c0aacbb3f883a4f29ad47fcf4d5af5d85a9904f22d575d5d36d419d606"} Dec 16 15:17:17 crc kubenswrapper[4728]: I1216 15:17:17.515845 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4dd890-0d1f-4db3-adb6-2b4422cc9d49" path="/var/lib/kubelet/pods/ea4dd890-0d1f-4db3-adb6-2b4422cc9d49/volumes" Dec 16 15:17:17 crc kubenswrapper[4728]: I1216 15:17:17.891721 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91fcaf90-dcf6-472f-a45d-58749d47bf43","Type":"ContainerStarted","Data":"4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41"} Dec 16 15:17:17 crc kubenswrapper[4728]: I1216 15:17:17.891766 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91fcaf90-dcf6-472f-a45d-58749d47bf43","Type":"ContainerStarted","Data":"0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f"} Dec 16 15:17:17 crc kubenswrapper[4728]: I1216 15:17:17.935119 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9350892330000002 podStartE2EDuration="2.935089233s" podCreationTimestamp="2025-12-16 15:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:17.913451595 +0000 UTC m=+1218.753630589" watchObservedRunningTime="2025-12-16 15:17:17.935089233 +0000 UTC m=+1218.775268247" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.169562 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.260308 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-c7xpj"] Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.260607 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerName="dnsmasq-dns" containerID="cri-o://2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa" gracePeriod=10 Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.756679 4728 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.782777 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vt5f\" (UniqueName: \"kubernetes.io/projected/0ed56382-f55a-4e37-9ef1-c0725189578a-kube-api-access-5vt5f\") pod \"0ed56382-f55a-4e37-9ef1-c0725189578a\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.782887 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-config\") pod \"0ed56382-f55a-4e37-9ef1-c0725189578a\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.782937 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-nb\") pod \"0ed56382-f55a-4e37-9ef1-c0725189578a\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.782962 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-sb\") pod \"0ed56382-f55a-4e37-9ef1-c0725189578a\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.783037 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-swift-storage-0\") pod \"0ed56382-f55a-4e37-9ef1-c0725189578a\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.783108 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-svc\") pod \"0ed56382-f55a-4e37-9ef1-c0725189578a\" (UID: \"0ed56382-f55a-4e37-9ef1-c0725189578a\") " Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.795767 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed56382-f55a-4e37-9ef1-c0725189578a-kube-api-access-5vt5f" (OuterVolumeSpecName: "kube-api-access-5vt5f") pod "0ed56382-f55a-4e37-9ef1-c0725189578a" (UID: "0ed56382-f55a-4e37-9ef1-c0725189578a"). InnerVolumeSpecName "kube-api-access-5vt5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.860065 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ed56382-f55a-4e37-9ef1-c0725189578a" (UID: "0ed56382-f55a-4e37-9ef1-c0725189578a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.863369 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ed56382-f55a-4e37-9ef1-c0725189578a" (UID: "0ed56382-f55a-4e37-9ef1-c0725189578a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.869642 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-config" (OuterVolumeSpecName: "config") pod "0ed56382-f55a-4e37-9ef1-c0725189578a" (UID: "0ed56382-f55a-4e37-9ef1-c0725189578a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.886855 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vt5f\" (UniqueName: \"kubernetes.io/projected/0ed56382-f55a-4e37-9ef1-c0725189578a-kube-api-access-5vt5f\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.886885 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.886894 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.886911 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.893832 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ed56382-f55a-4e37-9ef1-c0725189578a" (UID: "0ed56382-f55a-4e37-9ef1-c0725189578a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.902724 4728 generic.go:334] "Generic (PLEG): container finished" podID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerID="2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa" exitCode=0 Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.903092 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.903689 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" event={"ID":"0ed56382-f55a-4e37-9ef1-c0725189578a","Type":"ContainerDied","Data":"2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa"} Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.903741 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-c7xpj" event={"ID":"0ed56382-f55a-4e37-9ef1-c0725189578a","Type":"ContainerDied","Data":"08efae766f9067d98e209d5177d43466ebb92c0082374974ddad32b2f60e63cc"} Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.903760 4728 scope.go:117] "RemoveContainer" containerID="2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.906578 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ed56382-f55a-4e37-9ef1-c0725189578a" (UID: "0ed56382-f55a-4e37-9ef1-c0725189578a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.977497 4728 scope.go:117] "RemoveContainer" containerID="e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.988900 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:18 crc kubenswrapper[4728]: I1216 15:17:18.988925 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ed56382-f55a-4e37-9ef1-c0725189578a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.009115 4728 scope.go:117] "RemoveContainer" containerID="2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa" Dec 16 15:17:19 crc kubenswrapper[4728]: E1216 15:17:19.009584 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa\": container with ID starting with 2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa not found: ID does not exist" containerID="2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.009642 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa"} err="failed to get container status \"2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa\": rpc error: code = NotFound desc = could not find container \"2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa\": container with ID starting with 2a743aeb0b030be4554e6a36428113d80bc617e55ef354bb55e7e6d8e569bdfa not found: ID does not exist" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.009668 4728 scope.go:117] "RemoveContainer" containerID="e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357" Dec 16 15:17:19 crc kubenswrapper[4728]: E1216 15:17:19.010040 4728 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357\": container with ID starting with e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357 not found: ID does not exist" containerID="e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.010085 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357"} err="failed to get container status \"e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357\": rpc error: code = NotFound desc = could not find container \"e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357\": container with ID starting with e8e92ac33e785a8b45e5457bcccaef16500e94fbff2eaafbefd763a86e0db357 not found: ID does not exist" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.251371 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-c7xpj"] Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.261693 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-c7xpj"] Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.533911 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" path="/var/lib/kubelet/pods/0ed56382-f55a-4e37-9ef1-c0725189578a/volumes" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.534618 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.565268 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:19 crc kubenswrapper[4728]: I1216 15:17:19.934129 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.154541 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xzs88"] Dec 16 15:17:20 crc kubenswrapper[4728]: E1216 15:17:20.154893 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerName="init" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.154904 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerName="init" Dec 16 15:17:20 crc kubenswrapper[4728]: E1216 15:17:20.154926 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerName="dnsmasq-dns" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.154932 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerName="dnsmasq-dns" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.155113 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed56382-f55a-4e37-9ef1-c0725189578a" containerName="dnsmasq-dns" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.155761 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.160275 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.166799 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.176258 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xzs88"] Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.222068 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-scripts\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.222132 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4pc9\" (UniqueName: \"kubernetes.io/projected/e115a66a-589a-4762-b4d8-d80146360675-kube-api-access-b4pc9\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.222171 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.222226 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-config-data\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.323521 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4pc9\" (UniqueName: \"kubernetes.io/projected/e115a66a-589a-4762-b4d8-d80146360675-kube-api-access-b4pc9\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.323627 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.324586 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-config-data\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.324741 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-scripts\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.329242 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.329541 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-scripts\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.336097 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-config-data\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.342911 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4pc9\" (UniqueName: \"kubernetes.io/projected/e115a66a-589a-4762-b4d8-d80146360675-kube-api-access-b4pc9\") pod \"nova-cell1-cell-mapping-xzs88\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.476328 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.928206 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerStarted","Data":"c85d16479a524418702955cb14d20337559eb2d11ff0b7aa159f3f07742f2359"} Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.928977 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-central-agent" containerID="cri-o://3dc22c3bbeef54452440a9523a8aed13a666d7da8e2e4ff805714dafe4c19f5f" gracePeriod=30 Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.929222 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="proxy-httpd" containerID="cri-o://c85d16479a524418702955cb14d20337559eb2d11ff0b7aa159f3f07742f2359" gracePeriod=30 Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.929328 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="sg-core" containerID="cri-o://759832c0aacbb3f883a4f29ad47fcf4d5af5d85a9904f22d575d5d36d419d606" gracePeriod=30 Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.929402 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-notification-agent" containerID="cri-o://c5716f039d5208c9524406c4f1a1382099e93f36100c25d7df7c2e2fdf200a9e" gracePeriod=30 Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.945214 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xzs88"] Dec 16 15:17:20 crc kubenswrapper[4728]: W1216 15:17:20.949203 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode115a66a_589a_4762_b4d8_d80146360675.slice/crio-1e32493adfde5450c4cccd239dc9cee4c95ef09a642634ec2e47d832289c57bc WatchSource:0}: Error finding container 1e32493adfde5450c4cccd239dc9cee4c95ef09a642634ec2e47d832289c57bc: Status 404 returned error can't find the container with id 1e32493adfde5450c4cccd239dc9cee4c95ef09a642634ec2e47d832289c57bc Dec 16 15:17:20 crc kubenswrapper[4728]: I1216 15:17:20.970496 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.131673595 podStartE2EDuration="8.97046179s" podCreationTimestamp="2025-12-16 15:17:12 +0000 UTC" firstStartedPulling="2025-12-16 15:17:13.662738292 +0000 UTC m=+1214.502917296" lastFinishedPulling="2025-12-16 15:17:20.501526507 +0000 UTC m=+1221.341705491" observedRunningTime="2025-12-16 15:17:20.952916573 +0000 UTC m=+1221.793095607" watchObservedRunningTime="2025-12-16 15:17:20.97046179 +0000 UTC m=+1221.810640774" Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.939857 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xzs88" event={"ID":"e115a66a-589a-4762-b4d8-d80146360675","Type":"ContainerStarted","Data":"0db533ee323b74194ca8a61dc93d345eff8a404954a5bee6967ce494aee23d4c"} Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.940575 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cell-mapping-xzs88" event={"ID":"e115a66a-589a-4762-b4d8-d80146360675","Type":"ContainerStarted","Data":"1e32493adfde5450c4cccd239dc9cee4c95ef09a642634ec2e47d832289c57bc"} Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943278 4728 generic.go:334] "Generic (PLEG): container finished" podID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerID="c85d16479a524418702955cb14d20337559eb2d11ff0b7aa159f3f07742f2359" exitCode=0 Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943307 4728 generic.go:334] "Generic (PLEG): container finished" podID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerID="759832c0aacbb3f883a4f29ad47fcf4d5af5d85a9904f22d575d5d36d419d606" exitCode=2 Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943323 4728 generic.go:334] "Generic (PLEG): container finished" podID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerID="c5716f039d5208c9524406c4f1a1382099e93f36100c25d7df7c2e2fdf200a9e" exitCode=0 Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943334 4728 generic.go:334] "Generic (PLEG): container finished" podID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerID="3dc22c3bbeef54452440a9523a8aed13a666d7da8e2e4ff805714dafe4c19f5f" exitCode=0 Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943348 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerDied","Data":"c85d16479a524418702955cb14d20337559eb2d11ff0b7aa159f3f07742f2359"} Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943376 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerDied","Data":"759832c0aacbb3f883a4f29ad47fcf4d5af5d85a9904f22d575d5d36d419d606"} Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943391 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerDied","Data":"c5716f039d5208c9524406c4f1a1382099e93f36100c25d7df7c2e2fdf200a9e"} Dec 16 15:17:21 crc kubenswrapper[4728]: I1216 15:17:21.943415 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerDied","Data":"3dc22c3bbeef54452440a9523a8aed13a666d7da8e2e4ff805714dafe4c19f5f"} Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.683757 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xzs88" podStartSLOduration=2.6837330379999997 podStartE2EDuration="2.683733038s" podCreationTimestamp="2025-12-16 15:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:22.669213394 +0000 UTC m=+1223.509392418" watchObservedRunningTime="2025-12-16 15:17:22.683733038 +0000 UTC m=+1223.523912022" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.771226 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950401 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-run-httpd\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950539 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-config-data\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950598 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-combined-ca-bundle\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950763 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-sg-core-conf-yaml\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950810 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-log-httpd\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950856 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l54mh\" (UniqueName: \"kubernetes.io/projected/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-kube-api-access-l54mh\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950892 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-ceilometer-tls-certs\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.950918 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-scripts\") pod \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\" (UID: \"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9\") " Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.951861 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.951908 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.956784 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-kube-api-access-l54mh" (OuterVolumeSpecName: "kube-api-access-l54mh") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "kube-api-access-l54mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.957088 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-scripts" (OuterVolumeSpecName: "scripts") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.959982 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8ac31c8-0e4b-4a34-8951-3b8e679a89e9","Type":"ContainerDied","Data":"27396769306a9514cc58ab1ab835aa09f5f73b2d681f7f0adea11daf7eb34d33"} Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.960017 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.960046 4728 scope.go:117] "RemoveContainer" containerID="c85d16479a524418702955cb14d20337559eb2d11ff0b7aa159f3f07742f2359" Dec 16 15:17:22 crc kubenswrapper[4728]: I1216 15:17:22.989014 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.015597 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.029138 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053231 4728 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053429 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053444 4728 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053452 4728 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053464 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l54mh\" (UniqueName: \"kubernetes.io/projected/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-kube-api-access-l54mh\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053473 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.053481 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.070742 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-config-data" (OuterVolumeSpecName: "config-data") pod "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" (UID: "c8ac31c8-0e4b-4a34-8951-3b8e679a89e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.144771 4728 scope.go:117] "RemoveContainer" containerID="759832c0aacbb3f883a4f29ad47fcf4d5af5d85a9904f22d575d5d36d419d606" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.155173 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.166659 4728 scope.go:117] "RemoveContainer" containerID="c5716f039d5208c9524406c4f1a1382099e93f36100c25d7df7c2e2fdf200a9e" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.186416 4728 scope.go:117] "RemoveContainer" containerID="3dc22c3bbeef54452440a9523a8aed13a666d7da8e2e4ff805714dafe4c19f5f" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.312650 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.342040 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.361554 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:23 crc kubenswrapper[4728]: E1216 15:17:23.362059 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="sg-core" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362078 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="sg-core" Dec 16 15:17:23 crc kubenswrapper[4728]: E1216 15:17:23.362103 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-central-agent" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362110 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-central-agent" Dec 16 15:17:23 crc kubenswrapper[4728]: E1216 15:17:23.362127 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="proxy-httpd" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362134 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="proxy-httpd" Dec 16 15:17:23 crc kubenswrapper[4728]: E1216 15:17:23.362150 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-notification-agent" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362156 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-notification-agent" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362336 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="proxy-httpd" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362360 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-central-agent" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.362376 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="sg-core" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 
15:17:23.362384 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" containerName="ceilometer-notification-agent" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.364281 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.366839 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.367132 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.367379 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.386941 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464193 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da26377-a29a-4acb-8ef8-4d17c1431d0b-log-httpd\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464313 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464387 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464477 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da26377-a29a-4acb-8ef8-4d17c1431d0b-run-httpd\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464551 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtk8\" (UniqueName: \"kubernetes.io/projected/0da26377-a29a-4acb-8ef8-4d17c1431d0b-kube-api-access-lqtk8\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464587 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464633 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-scripts\") pod \"ceilometer-0\" (UID: 
\"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.464763 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-config-data\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.526298 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ac31c8-0e4b-4a34-8951-3b8e679a89e9" path="/var/lib/kubelet/pods/c8ac31c8-0e4b-4a34-8951-3b8e679a89e9/volumes" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.565833 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.565907 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.565940 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da26377-a29a-4acb-8ef8-4d17c1431d0b-run-httpd\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.565968 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtk8\" (UniqueName: \"kubernetes.io/projected/0da26377-a29a-4acb-8ef8-4d17c1431d0b-kube-api-access-lqtk8\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.565995 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.566022 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-scripts\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.566092 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-config-data\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.566141 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da26377-a29a-4acb-8ef8-4d17c1431d0b-log-httpd\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc 
kubenswrapper[4728]: I1216 15:17:23.566628 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da26377-a29a-4acb-8ef8-4d17c1431d0b-log-httpd\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.567872 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0da26377-a29a-4acb-8ef8-4d17c1431d0b-run-httpd\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.571892 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.572523 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.572722 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.572750 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-scripts\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.572952 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da26377-a29a-4acb-8ef8-4d17c1431d0b-config-data\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.587692 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtk8\" (UniqueName: \"kubernetes.io/projected/0da26377-a29a-4acb-8ef8-4d17c1431d0b-kube-api-access-lqtk8\") pod \"ceilometer-0\" (UID: \"0da26377-a29a-4acb-8ef8-4d17c1431d0b\") " pod="openstack/ceilometer-0" Dec 16 15:17:23 crc kubenswrapper[4728]: I1216 15:17:23.701289 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 16 15:17:24 crc kubenswrapper[4728]: W1216 15:17:24.162285 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0da26377_a29a_4acb_8ef8_4d17c1431d0b.slice/crio-aa5002d630f9ca262e93e8acf3d860dec4784344dff2b4ae05a776df08d96fed WatchSource:0}: Error finding container aa5002d630f9ca262e93e8acf3d860dec4784344dff2b4ae05a776df08d96fed: Status 404 returned error can't find the container with id aa5002d630f9ca262e93e8acf3d860dec4784344dff2b4ae05a776df08d96fed Dec 16 15:17:24 crc kubenswrapper[4728]: I1216 15:17:24.163267 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 16 15:17:24 crc kubenswrapper[4728]: I1216 15:17:24.997145 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da26377-a29a-4acb-8ef8-4d17c1431d0b","Type":"ContainerStarted","Data":"791d17f83f19737a1063ed73b740dbaa982daeb23e0df48dde5020938a953780"} Dec 16 15:17:24 crc kubenswrapper[4728]: I1216 15:17:24.997496 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da26377-a29a-4acb-8ef8-4d17c1431d0b","Type":"ContainerStarted","Data":"aa5002d630f9ca262e93e8acf3d860dec4784344dff2b4ae05a776df08d96fed"} Dec 16 15:17:26 crc kubenswrapper[4728]: I1216 15:17:26.007077 4728 generic.go:334] "Generic (PLEG): container finished" podID="e115a66a-589a-4762-b4d8-d80146360675" containerID="0db533ee323b74194ca8a61dc93d345eff8a404954a5bee6967ce494aee23d4c" exitCode=0 Dec 16 15:17:26 crc kubenswrapper[4728]: I1216 15:17:26.007634 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xzs88" event={"ID":"e115a66a-589a-4762-b4d8-d80146360675","Type":"ContainerDied","Data":"0db533ee323b74194ca8a61dc93d345eff8a404954a5bee6967ce494aee23d4c"} Dec 16 15:17:26 crc kubenswrapper[4728]: I1216 15:17:26.317355 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:26 crc kubenswrapper[4728]: I1216 15:17:26.317396 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.020239 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da26377-a29a-4acb-8ef8-4d17c1431d0b","Type":"ContainerStarted","Data":"3c62255e181726d4ce21e7c2dd7e55ca7d5a88bcd3a729c43485d8e212e3bb39"} Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.328581 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.328596 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.429099 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.540234 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-combined-ca-bundle\") pod \"e115a66a-589a-4762-b4d8-d80146360675\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.540346 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-scripts\") pod \"e115a66a-589a-4762-b4d8-d80146360675\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.540420 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-config-data\") pod \"e115a66a-589a-4762-b4d8-d80146360675\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.540462 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4pc9\" (UniqueName: \"kubernetes.io/projected/e115a66a-589a-4762-b4d8-d80146360675-kube-api-access-b4pc9\") pod \"e115a66a-589a-4762-b4d8-d80146360675\" (UID: \"e115a66a-589a-4762-b4d8-d80146360675\") " Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.546389 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-scripts" (OuterVolumeSpecName: "scripts") pod "e115a66a-589a-4762-b4d8-d80146360675" (UID: "e115a66a-589a-4762-b4d8-d80146360675"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.563599 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e115a66a-589a-4762-b4d8-d80146360675-kube-api-access-b4pc9" (OuterVolumeSpecName: "kube-api-access-b4pc9") pod "e115a66a-589a-4762-b4d8-d80146360675" (UID: "e115a66a-589a-4762-b4d8-d80146360675"). InnerVolumeSpecName "kube-api-access-b4pc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.591541 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-config-data" (OuterVolumeSpecName: "config-data") pod "e115a66a-589a-4762-b4d8-d80146360675" (UID: "e115a66a-589a-4762-b4d8-d80146360675"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.604491 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e115a66a-589a-4762-b4d8-d80146360675" (UID: "e115a66a-589a-4762-b4d8-d80146360675"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.643563 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.643599 4728 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-scripts\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.643610 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e115a66a-589a-4762-b4d8-d80146360675-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:27 crc kubenswrapper[4728]: I1216 15:17:27.643621 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4pc9\" (UniqueName: \"kubernetes.io/projected/e115a66a-589a-4762-b4d8-d80146360675-kube-api-access-b4pc9\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.034591 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xzs88" event={"ID":"e115a66a-589a-4762-b4d8-d80146360675","Type":"ContainerDied","Data":"1e32493adfde5450c4cccd239dc9cee4c95ef09a642634ec2e47d832289c57bc"} Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.034660 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e32493adfde5450c4cccd239dc9cee4c95ef09a642634ec2e47d832289c57bc" Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.034629 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xzs88" Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.216865 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.217126 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-log" containerID="cri-o://0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f" gracePeriod=30 Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.217210 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-api" containerID="cri-o://4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41" gracePeriod=30 Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.238336 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.238836 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a27a5934-36e6-4c83-add1-e362af6bf332" containerName="nova-scheduler-scheduler" containerID="cri-o://a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762" gracePeriod=30 Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.262034 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.262303 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" 
containerName="nova-metadata-log" containerID="cri-o://9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116" gracePeriod=30 Dec 16 15:17:28 crc kubenswrapper[4728]: I1216 15:17:28.262830 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-metadata" containerID="cri-o://84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4" gracePeriod=30 Dec 16 15:17:29 crc kubenswrapper[4728]: I1216 15:17:29.046041 4728 generic.go:334] "Generic (PLEG): container finished" podID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerID="9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116" exitCode=143 Dec 16 15:17:29 crc kubenswrapper[4728]: I1216 15:17:29.046422 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c71d61c-fde8-4ba1-a572-aac714b424fe","Type":"ContainerDied","Data":"9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116"} Dec 16 15:17:29 crc kubenswrapper[4728]: I1216 15:17:29.048401 4728 generic.go:334] "Generic (PLEG): container finished" podID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerID="0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f" exitCode=143 Dec 16 15:17:29 crc kubenswrapper[4728]: I1216 15:17:29.048450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91fcaf90-dcf6-472f-a45d-58749d47bf43","Type":"ContainerDied","Data":"0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f"} Dec 16 15:17:30 crc kubenswrapper[4728]: I1216 15:17:30.070649 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da26377-a29a-4acb-8ef8-4d17c1431d0b","Type":"ContainerStarted","Data":"56cf79cecf249d62aa1dad24d2ca3c6dba33276085f192a8a32872dc3717a96c"} Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.082070 4728 generic.go:334] "Generic (PLEG): container finished" podID="a27a5934-36e6-4c83-add1-e362af6bf332" containerID="a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762" exitCode=0 Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.082213 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a27a5934-36e6-4c83-add1-e362af6bf332","Type":"ContainerDied","Data":"a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762"} Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.087542 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0da26377-a29a-4acb-8ef8-4d17c1431d0b","Type":"ContainerStarted","Data":"d75702f8300f12ab2433fbb4b4112d73445d76a7ed595851f7c51efcd45f04f7"} Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.087811 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.112746 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.484588413 podStartE2EDuration="8.112722043s" podCreationTimestamp="2025-12-16 15:17:23 +0000 UTC" firstStartedPulling="2025-12-16 15:17:24.166560434 +0000 UTC m=+1225.006739418" lastFinishedPulling="2025-12-16 15:17:30.794694064 +0000 UTC m=+1231.634873048" observedRunningTime="2025-12-16 15:17:31.107042758 +0000 UTC m=+1231.947221742" watchObservedRunningTime="2025-12-16 15:17:31.112722043 +0000 UTC m=+1231.952901047" Dec 16 15:17:31 crc 
kubenswrapper[4728]: E1216 15:17:31.194169 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762 is running failed: container process not found" containerID="a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:17:31 crc kubenswrapper[4728]: E1216 15:17:31.194535 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762 is running failed: container process not found" containerID="a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:17:31 crc kubenswrapper[4728]: E1216 15:17:31.194782 4728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762 is running failed: container process not found" containerID="a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 16 15:17:31 crc kubenswrapper[4728]: E1216 15:17:31.194810 4728 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a27a5934-36e6-4c83-add1-e362af6bf332" containerName="nova-scheduler-scheduler" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.405537 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:52970->10.217.0.191:8775: read: connection reset by peer" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.405624 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:52968->10.217.0.191:8775: read: connection reset by peer" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.530568 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.619664 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-combined-ca-bundle\") pod \"a27a5934-36e6-4c83-add1-e362af6bf332\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.619813 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-config-data\") pod \"a27a5934-36e6-4c83-add1-e362af6bf332\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.619902 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nndn\" (UniqueName: \"kubernetes.io/projected/a27a5934-36e6-4c83-add1-e362af6bf332-kube-api-access-2nndn\") pod \"a27a5934-36e6-4c83-add1-e362af6bf332\" (UID: \"a27a5934-36e6-4c83-add1-e362af6bf332\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.626660 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27a5934-36e6-4c83-add1-e362af6bf332-kube-api-access-2nndn" (OuterVolumeSpecName: "kube-api-access-2nndn") pod "a27a5934-36e6-4c83-add1-e362af6bf332" (UID: "a27a5934-36e6-4c83-add1-e362af6bf332"). InnerVolumeSpecName "kube-api-access-2nndn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.659073 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a27a5934-36e6-4c83-add1-e362af6bf332" (UID: "a27a5934-36e6-4c83-add1-e362af6bf332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.683622 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-config-data" (OuterVolumeSpecName: "config-data") pod "a27a5934-36e6-4c83-add1-e362af6bf332" (UID: "a27a5934-36e6-4c83-add1-e362af6bf332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.722394 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.722451 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nndn\" (UniqueName: \"kubernetes.io/projected/a27a5934-36e6-4c83-add1-e362af6bf332-kube-api-access-2nndn\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.722463 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a5934-36e6-4c83-add1-e362af6bf332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.810358 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.924977 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-config-data\") pod \"3c71d61c-fde8-4ba1-a572-aac714b424fe\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.925374 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-combined-ca-bundle\") pod \"3c71d61c-fde8-4ba1-a572-aac714b424fe\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.925560 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c71d61c-fde8-4ba1-a572-aac714b424fe-logs\") pod \"3c71d61c-fde8-4ba1-a572-aac714b424fe\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.925739 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-nova-metadata-tls-certs\") pod \"3c71d61c-fde8-4ba1-a572-aac714b424fe\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.925920 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxqf8\" (UniqueName: \"kubernetes.io/projected/3c71d61c-fde8-4ba1-a572-aac714b424fe-kube-api-access-wxqf8\") pod \"3c71d61c-fde8-4ba1-a572-aac714b424fe\" (UID: \"3c71d61c-fde8-4ba1-a572-aac714b424fe\") " Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.926691 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c71d61c-fde8-4ba1-a572-aac714b424fe-logs" (OuterVolumeSpecName: "logs") pod "3c71d61c-fde8-4ba1-a572-aac714b424fe" (UID: "3c71d61c-fde8-4ba1-a572-aac714b424fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.929512 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c71d61c-fde8-4ba1-a572-aac714b424fe-kube-api-access-wxqf8" (OuterVolumeSpecName: "kube-api-access-wxqf8") pod "3c71d61c-fde8-4ba1-a572-aac714b424fe" (UID: "3c71d61c-fde8-4ba1-a572-aac714b424fe"). InnerVolumeSpecName "kube-api-access-wxqf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.956911 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-config-data" (OuterVolumeSpecName: "config-data") pod "3c71d61c-fde8-4ba1-a572-aac714b424fe" (UID: "3c71d61c-fde8-4ba1-a572-aac714b424fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:31 crc kubenswrapper[4728]: I1216 15:17:31.961607 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c71d61c-fde8-4ba1-a572-aac714b424fe" (UID: "3c71d61c-fde8-4ba1-a572-aac714b424fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.006802 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3c71d61c-fde8-4ba1-a572-aac714b424fe" (UID: "3c71d61c-fde8-4ba1-a572-aac714b424fe"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.029170 4728 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.029200 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxqf8\" (UniqueName: \"kubernetes.io/projected/3c71d61c-fde8-4ba1-a572-aac714b424fe-kube-api-access-wxqf8\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.029209 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.029220 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c71d61c-fde8-4ba1-a572-aac714b424fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.029229 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c71d61c-fde8-4ba1-a572-aac714b424fe-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.106563 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a27a5934-36e6-4c83-add1-e362af6bf332","Type":"ContainerDied","Data":"56836d2f94bda55823de6c934ae39510aba0782b0b770f0c7227433d64cbd917"} Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.106613 4728 scope.go:117] "RemoveContainer" containerID="a6f2bee580df18695cc327571b9b5eb829640d83f539014e480022d7d4c77762" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.106741 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.112756 4728 generic.go:334] "Generic (PLEG): container finished" podID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerID="84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4" exitCode=0 Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.112848 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.112820 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c71d61c-fde8-4ba1-a572-aac714b424fe","Type":"ContainerDied","Data":"84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4"} Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.112978 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c71d61c-fde8-4ba1-a572-aac714b424fe","Type":"ContainerDied","Data":"ed48f383fe6c5222f6517c9eeb4ef4aab6f470f1cab31bc65c4123a33182fbc0"} Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.188946 4728 scope.go:117] "RemoveContainer" containerID="84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.229874 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.240262 4728 scope.go:117] "RemoveContainer" containerID="9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.261088 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.262672 4728 scope.go:117] "RemoveContainer" containerID="84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4" Dec 16 15:17:32 crc kubenswrapper[4728]: E1216 15:17:32.263007 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4\": container with ID starting with 84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4 not found: ID does not exist" containerID="84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.263043 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4"} err="failed to get container status \"84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4\": rpc error: code = NotFound desc = could not find container \"84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4\": container with ID starting with 84f20be0badfce9bf1cd3cad9d3c9b0d764815f70f03808c5e23b74ce497c4c4 not found: ID does not exist" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.263065 4728 scope.go:117] "RemoveContainer" containerID="9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116" Dec 16 15:17:32 crc kubenswrapper[4728]: E1216 15:17:32.263254 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116\": container with ID starting with 9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116 not found: ID does not exist" containerID="9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.263277 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116"} err="failed to get container status \"9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116\": rpc error: code = 
NotFound desc = could not find container \"9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116\": container with ID starting with 9a27130df63902ff028345c251e71569440640e5c85f55de2190c27ba7037116 not found: ID does not exist" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.274374 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: E1216 15:17:32.274854 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27a5934-36e6-4c83-add1-e362af6bf332" containerName="nova-scheduler-scheduler" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.274876 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27a5934-36e6-4c83-add1-e362af6bf332" containerName="nova-scheduler-scheduler" Dec 16 15:17:32 crc kubenswrapper[4728]: E1216 15:17:32.274903 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-log" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.274910 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-log" Dec 16 15:17:32 crc kubenswrapper[4728]: E1216 15:17:32.274921 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-metadata" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.274926 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-metadata" Dec 16 15:17:32 crc kubenswrapper[4728]: E1216 15:17:32.274938 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115a66a-589a-4762-b4d8-d80146360675" containerName="nova-manage" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.274945 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115a66a-589a-4762-b4d8-d80146360675" containerName="nova-manage" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.275112 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27a5934-36e6-4c83-add1-e362af6bf332" containerName="nova-scheduler-scheduler" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.275133 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115a66a-589a-4762-b4d8-d80146360675" containerName="nova-manage" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.275143 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-metadata" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.275158 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" containerName="nova-metadata-log" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.275833 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.278077 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.283386 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.299009 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.307881 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.319745 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.321262 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.322751 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.323718 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.328950 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.333569 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkw79\" (UniqueName: \"kubernetes.io/projected/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-kube-api-access-rkw79\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.333641 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.333873 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-config-data\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.436049 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.436390 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-config-data\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.436552 4728 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.436708 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9khxh\" (UniqueName: \"kubernetes.io/projected/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-kube-api-access-9khxh\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.436982 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-logs\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.437190 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-config-data\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.437342 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkw79\" (UniqueName: \"kubernetes.io/projected/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-kube-api-access-rkw79\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.437475 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.442317 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-config-data\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.444171 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.472059 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkw79\" (UniqueName: \"kubernetes.io/projected/ba3dbe8b-ebcb-47f8-8f81-924aec84c326-kube-api-access-rkw79\") pod \"nova-scheduler-0\" (UID: \"ba3dbe8b-ebcb-47f8-8f81-924aec84c326\") " pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.538790 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " 
pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.538839 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.538859 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-config-data\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.538882 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9khxh\" (UniqueName: \"kubernetes.io/projected/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-kube-api-access-9khxh\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.538958 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-logs\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.539339 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-logs\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.544216 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.544940 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.546071 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-config-data\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.557921 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9khxh\" (UniqueName: \"kubernetes.io/projected/f1f3cd14-f2b0-4fde-a31e-e686b154eb77-kube-api-access-9khxh\") pod \"nova-metadata-0\" (UID: \"f1f3cd14-f2b0-4fde-a31e-e686b154eb77\") " pod="openstack/nova-metadata-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.596025 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 16 15:17:32 crc kubenswrapper[4728]: I1216 15:17:32.640803 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 16 15:17:33 crc kubenswrapper[4728]: W1216 15:17:33.063745 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba3dbe8b_ebcb_47f8_8f81_924aec84c326.slice/crio-771ee9cae5f8f536885f8e1da488a3d1eec623f0bc1b60482dfff4a132b20560 WatchSource:0}: Error finding container 771ee9cae5f8f536885f8e1da488a3d1eec623f0bc1b60482dfff4a132b20560: Status 404 returned error can't find the container with id 771ee9cae5f8f536885f8e1da488a3d1eec623f0bc1b60482dfff4a132b20560 Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.068113 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.075571 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.129873 4728 generic.go:334] "Generic (PLEG): container finished" podID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerID="4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41" exitCode=0 Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.129914 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.129945 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91fcaf90-dcf6-472f-a45d-58749d47bf43","Type":"ContainerDied","Data":"4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41"} Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.129999 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91fcaf90-dcf6-472f-a45d-58749d47bf43","Type":"ContainerDied","Data":"93805e4fcce6f8a09ce8f476e0e67787142952878b6963a85471dac4ff1f8b26"} Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.130026 4728 scope.go:117] "RemoveContainer" containerID="4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.132795 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba3dbe8b-ebcb-47f8-8f81-924aec84c326","Type":"ContainerStarted","Data":"771ee9cae5f8f536885f8e1da488a3d1eec623f0bc1b60482dfff4a132b20560"} Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.154492 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-internal-tls-certs\") pod \"91fcaf90-dcf6-472f-a45d-58749d47bf43\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.154633 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfnnz\" (UniqueName: \"kubernetes.io/projected/91fcaf90-dcf6-472f-a45d-58749d47bf43-kube-api-access-zfnnz\") pod \"91fcaf90-dcf6-472f-a45d-58749d47bf43\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.154991 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-config-data\") pod \"91fcaf90-dcf6-472f-a45d-58749d47bf43\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.155068 
4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-public-tls-certs\") pod \"91fcaf90-dcf6-472f-a45d-58749d47bf43\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.155099 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91fcaf90-dcf6-472f-a45d-58749d47bf43-logs\") pod \"91fcaf90-dcf6-472f-a45d-58749d47bf43\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.155223 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-combined-ca-bundle\") pod \"91fcaf90-dcf6-472f-a45d-58749d47bf43\" (UID: \"91fcaf90-dcf6-472f-a45d-58749d47bf43\") " Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.155617 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91fcaf90-dcf6-472f-a45d-58749d47bf43-logs" (OuterVolumeSpecName: "logs") pod "91fcaf90-dcf6-472f-a45d-58749d47bf43" (UID: "91fcaf90-dcf6-472f-a45d-58749d47bf43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.164822 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91fcaf90-dcf6-472f-a45d-58749d47bf43-kube-api-access-zfnnz" (OuterVolumeSpecName: "kube-api-access-zfnnz") pod "91fcaf90-dcf6-472f-a45d-58749d47bf43" (UID: "91fcaf90-dcf6-472f-a45d-58749d47bf43"). InnerVolumeSpecName "kube-api-access-zfnnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.165813 4728 scope.go:117] "RemoveContainer" containerID="0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.192392 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91fcaf90-dcf6-472f-a45d-58749d47bf43" (UID: "91fcaf90-dcf6-472f-a45d-58749d47bf43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.196859 4728 scope.go:117] "RemoveContainer" containerID="4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41" Dec 16 15:17:33 crc kubenswrapper[4728]: E1216 15:17:33.197346 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41\": container with ID starting with 4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41 not found: ID does not exist" containerID="4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.197380 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41"} err="failed to get container status \"4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41\": rpc error: code = NotFound desc = could not find container \"4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41\": container with ID starting with 4659f9a4f0060656a977d9c5def321903abbc80b42147aff293e2802723d0b41 not found: ID does not exist" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.197424 4728 scope.go:117] "RemoveContainer" containerID="0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f" Dec 16 15:17:33 crc kubenswrapper[4728]: E1216 15:17:33.197780 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f\": container with ID starting with 0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f not found: ID does not exist" containerID="0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.197826 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f"} err="failed to get container status \"0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f\": rpc error: code = NotFound desc = could not find container \"0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f\": container with ID starting with 0ba1145820a8ac3c83c5db7465f96081249e685aeb61db2dfe75b407c5c6c30f not found: ID does not exist" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.200376 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-config-data" (OuterVolumeSpecName: "config-data") pod "91fcaf90-dcf6-472f-a45d-58749d47bf43" (UID: "91fcaf90-dcf6-472f-a45d-58749d47bf43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.222752 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "91fcaf90-dcf6-472f-a45d-58749d47bf43" (UID: "91fcaf90-dcf6-472f-a45d-58749d47bf43"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.244207 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.248001 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "91fcaf90-dcf6-472f-a45d-58749d47bf43" (UID: "91fcaf90-dcf6-472f-a45d-58749d47bf43"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:17:33 crc kubenswrapper[4728]: W1216 15:17:33.249098 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f3cd14_f2b0_4fde_a31e_e686b154eb77.slice/crio-ec355abb0e593a7e1fddc4937d81bbd000de714f72e6c9032a95513abdaec4c8 WatchSource:0}: Error finding container ec355abb0e593a7e1fddc4937d81bbd000de714f72e6c9032a95513abdaec4c8: Status 404 returned error can't find the container with id ec355abb0e593a7e1fddc4937d81bbd000de714f72e6c9032a95513abdaec4c8 Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.257018 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfnnz\" (UniqueName: \"kubernetes.io/projected/91fcaf90-dcf6-472f-a45d-58749d47bf43-kube-api-access-zfnnz\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.257045 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.257060 4728 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.257071 4728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91fcaf90-dcf6-472f-a45d-58749d47bf43-logs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.257080 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.257089 4728 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/91fcaf90-dcf6-472f-a45d-58749d47bf43-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.479552 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.488488 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.495613 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:33 crc kubenswrapper[4728]: E1216 15:17:33.495982 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-api" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.495999 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-api" Dec 16 15:17:33 crc kubenswrapper[4728]: E1216 15:17:33.496018 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-log" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.496024 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-log" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.496174 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-log" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.496192 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" containerName="nova-api-api" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.497073 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.499116 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.499374 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.500534 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.527587 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c71d61c-fde8-4ba1-a572-aac714b424fe" path="/var/lib/kubelet/pods/3c71d61c-fde8-4ba1-a572-aac714b424fe/volumes" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.528437 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91fcaf90-dcf6-472f-a45d-58749d47bf43" path="/var/lib/kubelet/pods/91fcaf90-dcf6-472f-a45d-58749d47bf43/volumes" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.529146 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27a5934-36e6-4c83-add1-e362af6bf332" path="/var/lib/kubelet/pods/a27a5934-36e6-4c83-add1-e362af6bf332/volumes" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.530389 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.561385 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.561456 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-config-data\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.561501 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 
crc kubenswrapper[4728]: I1216 15:17:33.561522 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c311506-90af-4f99-867d-aa1f1b5d2d74-logs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.562452 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5d5d\" (UniqueName: \"kubernetes.io/projected/8c311506-90af-4f99-867d-aa1f1b5d2d74-kube-api-access-b5d5d\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.562568 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.664376 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.664747 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-config-data\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.664788 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.664812 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c311506-90af-4f99-867d-aa1f1b5d2d74-logs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.665125 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5d5d\" (UniqueName: \"kubernetes.io/projected/8c311506-90af-4f99-867d-aa1f1b5d2d74-kube-api-access-b5d5d\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.665220 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.665337 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c311506-90af-4f99-867d-aa1f1b5d2d74-logs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 
crc kubenswrapper[4728]: I1216 15:17:33.668597 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.681382 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.681648 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-config-data\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.681901 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c311506-90af-4f99-867d-aa1f1b5d2d74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.683795 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5d5d\" (UniqueName: \"kubernetes.io/projected/8c311506-90af-4f99-867d-aa1f1b5d2d74-kube-api-access-b5d5d\") pod \"nova-api-0\" (UID: \"8c311506-90af-4f99-867d-aa1f1b5d2d74\") " pod="openstack/nova-api-0" Dec 16 15:17:33 crc kubenswrapper[4728]: I1216 15:17:33.930147 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 16 15:17:34 crc kubenswrapper[4728]: I1216 15:17:34.149932 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f3cd14-f2b0-4fde-a31e-e686b154eb77","Type":"ContainerStarted","Data":"8324ab2e0782f9a495b7e9912ac7ce5aae4e18f31ea22ad2ca99b9a0ca9f07a0"} Dec 16 15:17:34 crc kubenswrapper[4728]: I1216 15:17:34.150285 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f3cd14-f2b0-4fde-a31e-e686b154eb77","Type":"ContainerStarted","Data":"9f67f7db05be3a624832dabdb1fbe9daf25e65a130e149c5937aaa1578cf5c08"} Dec 16 15:17:34 crc kubenswrapper[4728]: I1216 15:17:34.150300 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f3cd14-f2b0-4fde-a31e-e686b154eb77","Type":"ContainerStarted","Data":"ec355abb0e593a7e1fddc4937d81bbd000de714f72e6c9032a95513abdaec4c8"} Dec 16 15:17:34 crc kubenswrapper[4728]: I1216 15:17:34.153567 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba3dbe8b-ebcb-47f8-8f81-924aec84c326","Type":"ContainerStarted","Data":"2bbbf39a6f2938d0bae8dd977f5decd179d81e3f05991418d7991f548d57f3f7"} Dec 16 15:17:34 crc kubenswrapper[4728]: I1216 15:17:34.192899 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.192874092 podStartE2EDuration="2.192874092s" podCreationTimestamp="2025-12-16 15:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:34.173742979 +0000 UTC m=+1235.013921963" watchObservedRunningTime="2025-12-16 15:17:34.192874092 +0000 UTC m=+1235.033053076" Dec 16 15:17:34 crc kubenswrapper[4728]: I1216 15:17:34.398377 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 16 15:17:35 crc kubenswrapper[4728]: I1216 15:17:35.168380 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c311506-90af-4f99-867d-aa1f1b5d2d74","Type":"ContainerStarted","Data":"ccaf368dcc0e320ef949e4bb74b7713b4d7d8d085eb8325e08860bf36cb09ba1"} Dec 16 15:17:35 crc kubenswrapper[4728]: I1216 15:17:35.169676 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c311506-90af-4f99-867d-aa1f1b5d2d74","Type":"ContainerStarted","Data":"aa0eab9baab046ec5ebb73f58582561e6bd4088c298cf9827971910370caee6c"} Dec 16 15:17:35 crc kubenswrapper[4728]: I1216 15:17:35.198473 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.198454904 podStartE2EDuration="3.198454904s" podCreationTimestamp="2025-12-16 15:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:35.195050001 +0000 UTC m=+1236.035229005" watchObservedRunningTime="2025-12-16 15:17:35.198454904 +0000 UTC m=+1236.038633898" Dec 16 15:17:36 crc kubenswrapper[4728]: I1216 15:17:36.189038 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c311506-90af-4f99-867d-aa1f1b5d2d74","Type":"ContainerStarted","Data":"dc678b79a4adee307a349106688de167184ff86fe53279d73202a7b04bf489e0"} Dec 16 15:17:36 crc kubenswrapper[4728]: I1216 15:17:36.232970 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=3.232940157 podStartE2EDuration="3.232940157s" podCreationTimestamp="2025-12-16 15:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:17:36.227879498 +0000 UTC m=+1237.068058512" watchObservedRunningTime="2025-12-16 15:17:36.232940157 +0000 UTC m=+1237.073119171" Dec 16 15:17:37 crc kubenswrapper[4728]: I1216 15:17:37.597804 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 16 15:17:37 crc kubenswrapper[4728]: I1216 15:17:37.642690 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:17:37 crc kubenswrapper[4728]: I1216 15:17:37.642776 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 16 15:17:38 crc kubenswrapper[4728]: I1216 15:17:38.819294 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:17:38 crc kubenswrapper[4728]: I1216 15:17:38.819476 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:17:42 crc kubenswrapper[4728]: I1216 15:17:42.597127 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 16 15:17:42 crc kubenswrapper[4728]: I1216 15:17:42.641459 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:17:42 crc kubenswrapper[4728]: I1216 15:17:42.641523 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 16 15:17:42 crc kubenswrapper[4728]: I1216 15:17:42.646677 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4728]: I1216 15:17:43.317739 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 16 15:17:43 crc kubenswrapper[4728]: I1216 15:17:43.659643 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1f3cd14-f2b0-4fde-a31e-e686b154eb77" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:43 crc kubenswrapper[4728]: I1216 15:17:43.667674 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1f3cd14-f2b0-4fde-a31e-e686b154eb77" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:43 crc kubenswrapper[4728]: I1216 15:17:43.930863 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:43 crc kubenswrapper[4728]: I1216 15:17:43.932238 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Dec 16 15:17:44 crc kubenswrapper[4728]: I1216 15:17:44.937845 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c311506-90af-4f99-867d-aa1f1b5d2d74" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:44 crc kubenswrapper[4728]: I1216 15:17:44.943592 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c311506-90af-4f99-867d-aa1f1b5d2d74" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 15:17:52 crc kubenswrapper[4728]: I1216 15:17:52.649935 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:17:52 crc kubenswrapper[4728]: I1216 15:17:52.655376 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 16 15:17:52 crc kubenswrapper[4728]: I1216 15:17:52.663341 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:17:53 crc kubenswrapper[4728]: I1216 15:17:53.381865 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 16 15:17:53 crc kubenswrapper[4728]: I1216 15:17:53.709719 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 16 15:17:53 crc kubenswrapper[4728]: I1216 15:17:53.936717 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:17:53 crc kubenswrapper[4728]: I1216 15:17:53.937141 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:17:53 crc kubenswrapper[4728]: I1216 15:17:53.937351 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 16 15:17:53 crc kubenswrapper[4728]: I1216 15:17:53.942448 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:17:54 crc kubenswrapper[4728]: I1216 15:17:54.383936 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 16 15:17:54 crc kubenswrapper[4728]: I1216 15:17:54.391993 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 16 15:18:03 crc kubenswrapper[4728]: I1216 15:18:03.069666 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:18:04 crc kubenswrapper[4728]: I1216 15:18:04.043283 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:18:07 crc kubenswrapper[4728]: I1216 15:18:07.357310 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerName="rabbitmq" containerID="cri-o://69d182808d85ef8d13fed9a1b0ac19d7d1bef637754fda7f1fa2fcc8415b1b1e" gracePeriod=604796 Dec 16 15:18:08 crc kubenswrapper[4728]: I1216 15:18:08.803128 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerName="rabbitmq" 
containerID="cri-o://7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6" gracePeriod=604796 Dec 16 15:18:08 crc kubenswrapper[4728]: I1216 15:18:08.819594 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:18:08 crc kubenswrapper[4728]: I1216 15:18:08.819673 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:18:08 crc kubenswrapper[4728]: I1216 15:18:08.819723 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:18:08 crc kubenswrapper[4728]: I1216 15:18:08.820612 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1b468897a2b4285ac91242e60a4e7ba38f4d070d647de3374233d1332ee4a0d"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:18:08 crc kubenswrapper[4728]: I1216 15:18:08.820672 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://b1b468897a2b4285ac91242e60a4e7ba38f4d070d647de3374233d1332ee4a0d" gracePeriod=600 Dec 16 15:18:09 crc kubenswrapper[4728]: I1216 15:18:09.548586 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="b1b468897a2b4285ac91242e60a4e7ba38f4d070d647de3374233d1332ee4a0d" exitCode=0 Dec 16 15:18:09 crc kubenswrapper[4728]: I1216 15:18:09.549094 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"b1b468897a2b4285ac91242e60a4e7ba38f4d070d647de3374233d1332ee4a0d"} Dec 16 15:18:09 crc kubenswrapper[4728]: I1216 15:18:09.549126 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"cf4555b97afbd3b3d2de44b030a2e6b901aec1b1c9811cdabf788a725a3bd7ca"} Dec 16 15:18:09 crc kubenswrapper[4728]: I1216 15:18:09.549144 4728 scope.go:117] "RemoveContainer" containerID="5f528a37171fd283501ab52158c0534c2dc70337f5ffb233b47cd1885a45c673" Dec 16 15:18:13 crc kubenswrapper[4728]: I1216 15:18:13.589849 4728 generic.go:334] "Generic (PLEG): container finished" podID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerID="69d182808d85ef8d13fed9a1b0ac19d7d1bef637754fda7f1fa2fcc8415b1b1e" exitCode=0 Dec 16 15:18:13 crc kubenswrapper[4728]: I1216 15:18:13.590476 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b12213-b2ec-4fa5-b848-d06fe7855247","Type":"ContainerDied","Data":"69d182808d85ef8d13fed9a1b0ac19d7d1bef637754fda7f1fa2fcc8415b1b1e"} 
Dec 16 15:18:13 crc kubenswrapper[4728]: I1216 15:18:13.970146 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.063960 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-plugins\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064022 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-config-data\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064039 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-plugins-conf\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064071 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-confd\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064124 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzml2\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-kube-api-access-rzml2\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064164 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-server-conf\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064226 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-erlang-cookie\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064246 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b12213-b2ec-4fa5-b848-d06fe7855247-erlang-cookie-secret\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064277 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064337 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-tls\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064388 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b12213-b2ec-4fa5-b848-d06fe7855247-pod-info\") pod \"42b12213-b2ec-4fa5-b848-d06fe7855247\" (UID: \"42b12213-b2ec-4fa5-b848-d06fe7855247\") " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.064810 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.069747 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.070708 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b12213-b2ec-4fa5-b848-d06fe7855247-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.071510 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.075635 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/42b12213-b2ec-4fa5-b848-d06fe7855247-pod-info" (OuterVolumeSpecName: "pod-info") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.075643 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-kube-api-access-rzml2" (OuterVolumeSpecName: "kube-api-access-rzml2") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "kube-api-access-rzml2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.118731 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). 
InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.118919 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.166927 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.166958 4728 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b12213-b2ec-4fa5-b848-d06fe7855247-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.166983 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.166995 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.167011 4728 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b12213-b2ec-4fa5-b848-d06fe7855247-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.167022 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.167032 4728 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.167043 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzml2\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-kube-api-access-rzml2\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.202883 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.203120 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-config-data" (OuterVolumeSpecName: "config-data") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.232309 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-server-conf" (OuterVolumeSpecName: "server-conf") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.268733 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.268763 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.268771 4728 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b12213-b2ec-4fa5-b848-d06fe7855247-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.291920 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "42b12213-b2ec-4fa5-b848-d06fe7855247" (UID: "42b12213-b2ec-4fa5-b848-d06fe7855247"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.370835 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b12213-b2ec-4fa5-b848-d06fe7855247-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.600836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b12213-b2ec-4fa5-b848-d06fe7855247","Type":"ContainerDied","Data":"ea98c649e14fe396963ed064767296630c30364215dc40884e12396c98ee0c43"} Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.600928 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.601173 4728 scope.go:117] "RemoveContainer" containerID="69d182808d85ef8d13fed9a1b0ac19d7d1bef637754fda7f1fa2fcc8415b1b1e" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.657042 4728 scope.go:117] "RemoveContainer" containerID="8773469a391248aa723b82a38b327739121d862fabed4fa45660e65d6ebf6b43" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.668934 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.678470 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.728942 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:18:14 crc kubenswrapper[4728]: E1216 15:18:14.729327 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerName="setup-container" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.729341 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerName="setup-container" Dec 16 15:18:14 crc kubenswrapper[4728]: E1216 15:18:14.729366 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerName="rabbitmq" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.729372 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerName="rabbitmq" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.729582 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" containerName="rabbitmq" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.730458 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.734881 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.735102 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.735124 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.735248 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.735363 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.735427 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.735653 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z52lk" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.756259 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777584 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777639 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777680 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-config-data\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777699 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777717 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777739 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/e19aee19-231d-4847-9e7e-78b8745576ae-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777772 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777794 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777826 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19aee19-231d-4847-9e7e-78b8745576ae-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777844 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vn6l\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-kube-api-access-6vn6l\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.777879 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879657 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879813 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879877 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879907 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879942 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.879986 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19aee19-231d-4847-9e7e-78b8745576ae-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.880044 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.880089 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.880151 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19aee19-231d-4847-9e7e-78b8745576ae-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.880168 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.880191 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vn6l\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-kube-api-access-6vn6l\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.881090 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.881111 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.881197 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-config-data\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.881453 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19aee19-231d-4847-9e7e-78b8745576ae-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.881750 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.885967 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19aee19-231d-4847-9e7e-78b8745576ae-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.885982 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.887934 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.891384 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19aee19-231d-4847-9e7e-78b8745576ae-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.903421 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vn6l\" (UniqueName: \"kubernetes.io/projected/e19aee19-231d-4847-9e7e-78b8745576ae-kube-api-access-6vn6l\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:14 crc kubenswrapper[4728]: I1216 15:18:14.923077 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e19aee19-231d-4847-9e7e-78b8745576ae\") " pod="openstack/rabbitmq-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.047738 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.344614 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391452 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-tls\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391505 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-server-conf\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391560 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31e565e7-a84a-436e-bc5d-dc107a42ef0f-pod-info\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391591 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-plugins-conf\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391695 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-confd\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391750 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391858 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-plugins\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391887 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31e565e7-a84a-436e-bc5d-dc107a42ef0f-erlang-cookie-secret\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391917 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-erlang-cookie\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391957 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dw574\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-kube-api-access-dw574\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.391997 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-config-data\") pod \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\" (UID: \"31e565e7-a84a-436e-bc5d-dc107a42ef0f\") " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.392901 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.393422 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.394671 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.397690 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/31e565e7-a84a-436e-bc5d-dc107a42ef0f-pod-info" (OuterVolumeSpecName: "pod-info") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.409952 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.410490 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.412686 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-kube-api-access-dw574" (OuterVolumeSpecName: "kube-api-access-dw574") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "kube-api-access-dw574". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.425144 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e565e7-a84a-436e-bc5d-dc107a42ef0f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.475199 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-config-data" (OuterVolumeSpecName: "config-data") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498798 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw574\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-kube-api-access-dw574\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498827 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498838 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498847 4728 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31e565e7-a84a-436e-bc5d-dc107a42ef0f-pod-info\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498854 4728 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498875 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498884 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498894 4728 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31e565e7-a84a-436e-bc5d-dc107a42ef0f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 16 
15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.498903 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.501023 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-server-conf" (OuterVolumeSpecName: "server-conf") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.566749 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b12213-b2ec-4fa5-b848-d06fe7855247" path="/var/lib/kubelet/pods/42b12213-b2ec-4fa5-b848-d06fe7855247/volumes" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.570292 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-x2jwj"] Dec 16 15:18:15 crc kubenswrapper[4728]: E1216 15:18:15.571122 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerName="setup-container" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.571139 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerName="setup-container" Dec 16 15:18:15 crc kubenswrapper[4728]: E1216 15:18:15.571150 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerName="rabbitmq" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.571156 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerName="rabbitmq" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.571533 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerName="rabbitmq" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.573476 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.577231 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.590383 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-x2jwj"] Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.606742 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.608161 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.608478 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-config\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.608573 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.608733 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.608893 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.609078 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr8gn\" (UniqueName: \"kubernetes.io/projected/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-kube-api-access-fr8gn\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.609441 4728 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31e565e7-a84a-436e-bc5d-dc107a42ef0f-server-conf\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc 
kubenswrapper[4728]: I1216 15:18:15.615960 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.637152 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.646087 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "31e565e7-a84a-436e-bc5d-dc107a42ef0f" (UID: "31e565e7-a84a-436e-bc5d-dc107a42ef0f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.653231 4728 generic.go:334] "Generic (PLEG): container finished" podID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" containerID="7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6" exitCode=0 Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.653317 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31e565e7-a84a-436e-bc5d-dc107a42ef0f","Type":"ContainerDied","Data":"7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6"} Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.653344 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31e565e7-a84a-436e-bc5d-dc107a42ef0f","Type":"ContainerDied","Data":"eb936f4744cb9c19394b2cb4f41d59227a74fd9327a19cdbb5053c25cd272ee5"} Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.653362 4728 scope.go:117] "RemoveContainer" containerID="7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.653519 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.711208 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.711301 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.711458 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-config\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.712768 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-config\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.712724 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.712216 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.712778 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.713827 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.713906 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 
15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.713942 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.714974 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr8gn\" (UniqueName: \"kubernetes.io/projected/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-kube-api-access-fr8gn\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.715092 4728 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31e565e7-a84a-436e-bc5d-dc107a42ef0f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.715107 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.714823 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.715744 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.732134 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr8gn\" (UniqueName: \"kubernetes.io/projected/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-kube-api-access-fr8gn\") pod \"dnsmasq-dns-79bd4cc8c9-x2jwj\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.795348 4728 scope.go:117] "RemoveContainer" containerID="25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.826763 4728 scope.go:117] "RemoveContainer" containerID="7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6" Dec 16 15:18:15 crc kubenswrapper[4728]: E1216 15:18:15.827642 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6\": container with ID starting with 7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6 not found: ID does not exist" containerID="7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.827689 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6"} err="failed to get container status 
\"7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6\": rpc error: code = NotFound desc = could not find container \"7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6\": container with ID starting with 7388f0bc656be30aa78edb892d189c00f38ce5d250825fe92c42b64091ef88d6 not found: ID does not exist" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.827718 4728 scope.go:117] "RemoveContainer" containerID="25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5" Dec 16 15:18:15 crc kubenswrapper[4728]: E1216 15:18:15.828450 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5\": container with ID starting with 25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5 not found: ID does not exist" containerID="25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.828497 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5"} err="failed to get container status \"25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5\": rpc error: code = NotFound desc = could not find container \"25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5\": container with ID starting with 25f7c34371ad1518f2f31cdbe10040f470fc6c9515fa0e175e6a6c90a70878a5 not found: ID does not exist" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.833613 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.853503 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.869955 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.871519 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.874661 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.874866 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.875019 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-tl7xr" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.875197 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.875340 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.875676 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.882778 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.894240 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.895360 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918113 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918148 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918178 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918220 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918238 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918264 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918310 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918328 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918342 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5js6v\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-kube-api-access-5js6v\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918359 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:15 crc kubenswrapper[4728]: I1216 15:18:15.918382 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.021588 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022115 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5js6v\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-kube-api-access-5js6v\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022191 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022287 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022448 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022517 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022612 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022710 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022776 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022782 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.022890 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.023319 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.023603 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.024201 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.024832 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.027117 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.038831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.039989 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.040955 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.042293 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.048391 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5js6v\" (UniqueName: \"kubernetes.io/projected/e64ff4ca-1141-477e-8db1-b2068e3b6d9a-kube-api-access-5js6v\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.070223 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e64ff4ca-1141-477e-8db1-b2068e3b6d9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.213534 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.384550 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-x2jwj"] Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.674656 4728 generic.go:334] "Generic (PLEG): container finished" podID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerID="10418afb645e15c45538d0f15a6479d1a5aa2fca8b657b6aafcb24d18712547b" exitCode=0 Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.675009 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" event={"ID":"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed","Type":"ContainerDied","Data":"10418afb645e15c45538d0f15a6479d1a5aa2fca8b657b6aafcb24d18712547b"} Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.675040 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" event={"ID":"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed","Type":"ContainerStarted","Data":"e2f326078dd8220d9926b8e5224342c9067215ab044c3b9d87447bab98f244e5"} Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.678742 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e19aee19-231d-4847-9e7e-78b8745576ae","Type":"ContainerStarted","Data":"d1ab773113ca5201fb1b07cbc572c9030099109d306360023ae004d66d052365"} Dec 16 15:18:16 crc kubenswrapper[4728]: I1216 15:18:16.678928 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 16 15:18:16 crc kubenswrapper[4728]: W1216 15:18:16.741382 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64ff4ca_1141_477e_8db1_b2068e3b6d9a.slice/crio-747296a10b96aa1304d3902829304f0b00db22f65cb4a616382874a7768dda9f WatchSource:0}: Error finding container 747296a10b96aa1304d3902829304f0b00db22f65cb4a616382874a7768dda9f: Status 404 returned error can't find the container with id 747296a10b96aa1304d3902829304f0b00db22f65cb4a616382874a7768dda9f Dec 16 15:18:17 crc kubenswrapper[4728]: I1216 15:18:17.519042 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e565e7-a84a-436e-bc5d-dc107a42ef0f" path="/var/lib/kubelet/pods/31e565e7-a84a-436e-bc5d-dc107a42ef0f/volumes" Dec 16 15:18:17 crc kubenswrapper[4728]: I1216 15:18:17.693607 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e19aee19-231d-4847-9e7e-78b8745576ae","Type":"ContainerStarted","Data":"c0fa4ab00aae131d629422d1c90b250a0bc87a1811716e09d537baa47a369201"} Dec 16 15:18:17 crc kubenswrapper[4728]: I1216 15:18:17.694683 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"e64ff4ca-1141-477e-8db1-b2068e3b6d9a","Type":"ContainerStarted","Data":"747296a10b96aa1304d3902829304f0b00db22f65cb4a616382874a7768dda9f"} Dec 16 15:18:17 crc kubenswrapper[4728]: I1216 15:18:17.696712 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" event={"ID":"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed","Type":"ContainerStarted","Data":"7fd4c57ca1c257a6044cc34fb8b602e0f90ece2849abfe05780c2ebadb0477c5"} Dec 16 15:18:17 crc kubenswrapper[4728]: I1216 15:18:17.696959 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:17 crc kubenswrapper[4728]: I1216 15:18:17.740535 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" podStartSLOduration=2.740513574 podStartE2EDuration="2.740513574s" podCreationTimestamp="2025-12-16 15:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:17.733910414 +0000 UTC m=+1278.574089418" watchObservedRunningTime="2025-12-16 15:18:17.740513574 +0000 UTC m=+1278.580692578" Dec 16 15:18:18 crc kubenswrapper[4728]: I1216 15:18:18.719625 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e64ff4ca-1141-477e-8db1-b2068e3b6d9a","Type":"ContainerStarted","Data":"c0c45be4a49e59d096b150ded73b4b5ef2f193ebf1b4b99470041205f8fb4551"} Dec 16 15:18:25 crc kubenswrapper[4728]: I1216 15:18:25.897579 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:25 crc kubenswrapper[4728]: I1216 15:18:25.993301 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-l6m2w"] Dec 16 15:18:25 crc kubenswrapper[4728]: I1216 15:18:25.993614 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerName="dnsmasq-dns" containerID="cri-o://aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c" gracePeriod=10 Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.176393 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-xsxxz"] Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.177923 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.201860 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-xsxxz"] Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.300932 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.301014 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-dns-svc\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.301045 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.301121 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.301147 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-config\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.301185 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.301423 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvk4\" (UniqueName: \"kubernetes.io/projected/e4e2028c-f46c-4fd1-8dee-4fb4860de081-kube-api-access-fzvk4\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.403312 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.403750 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-config\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.403806 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.403874 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvk4\" (UniqueName: \"kubernetes.io/projected/e4e2028c-f46c-4fd1-8dee-4fb4860de081-kube-api-access-fzvk4\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.403913 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.403981 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-dns-svc\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.404022 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.404719 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-config\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.404763 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.405456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.405550 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.405586 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.406245 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4e2028c-f46c-4fd1-8dee-4fb4860de081-dns-svc\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.425351 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvk4\" (UniqueName: \"kubernetes.io/projected/e4e2028c-f46c-4fd1-8dee-4fb4860de081-kube-api-access-fzvk4\") pod \"dnsmasq-dns-55478c4467-xsxxz\" (UID: \"e4e2028c-f46c-4fd1-8dee-4fb4860de081\") " pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.523644 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.535625 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.708426 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-svc\") pod \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.708784 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2k27\" (UniqueName: \"kubernetes.io/projected/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-kube-api-access-d2k27\") pod \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.708836 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-sb\") pod \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.708894 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-swift-storage-0\") pod \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.708924 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-nb\") pod \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " Dec 
16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.709014 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-config\") pod \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\" (UID: \"d6f19c63-40d1-4c3d-9c6d-027581f63b2c\") " Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.732538 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-kube-api-access-d2k27" (OuterVolumeSpecName: "kube-api-access-d2k27") pod "d6f19c63-40d1-4c3d-9c6d-027581f63b2c" (UID: "d6f19c63-40d1-4c3d-9c6d-027581f63b2c"). InnerVolumeSpecName "kube-api-access-d2k27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.767274 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d6f19c63-40d1-4c3d-9c6d-027581f63b2c" (UID: "d6f19c63-40d1-4c3d-9c6d-027581f63b2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.768043 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d6f19c63-40d1-4c3d-9c6d-027581f63b2c" (UID: "d6f19c63-40d1-4c3d-9c6d-027581f63b2c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.788052 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6f19c63-40d1-4c3d-9c6d-027581f63b2c" (UID: "d6f19c63-40d1-4c3d-9c6d-027581f63b2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.794772 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-config" (OuterVolumeSpecName: "config") pod "d6f19c63-40d1-4c3d-9c6d-027581f63b2c" (UID: "d6f19c63-40d1-4c3d-9c6d-027581f63b2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.798127 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d6f19c63-40d1-4c3d-9c6d-027581f63b2c" (UID: "d6f19c63-40d1-4c3d-9c6d-027581f63b2c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.816863 4728 generic.go:334] "Generic (PLEG): container finished" podID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerID="aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c" exitCode=0 Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.816903 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" event={"ID":"d6f19c63-40d1-4c3d-9c6d-027581f63b2c","Type":"ContainerDied","Data":"aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c"} Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.816927 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" event={"ID":"d6f19c63-40d1-4c3d-9c6d-027581f63b2c","Type":"ContainerDied","Data":"7f6084bd837fff635fefdc76235be32415252f425e11871db0f24adc90522b95"} Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.816942 4728 scope.go:117] "RemoveContainer" containerID="aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.817044 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-l6m2w" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.820264 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.820309 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2k27\" (UniqueName: \"kubernetes.io/projected/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-kube-api-access-d2k27\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.820326 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.820342 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.820351 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.820362 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f19c63-40d1-4c3d-9c6d-027581f63b2c-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.840780 4728 scope.go:117] "RemoveContainer" containerID="f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.863043 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-l6m2w"] Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.873633 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-l6m2w"] Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.879031 4728 scope.go:117] "RemoveContainer" 
containerID="aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c" Dec 16 15:18:26 crc kubenswrapper[4728]: E1216 15:18:26.879964 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c\": container with ID starting with aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c not found: ID does not exist" containerID="aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.879996 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c"} err="failed to get container status \"aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c\": rpc error: code = NotFound desc = could not find container \"aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c\": container with ID starting with aff73f10aaa8fc441c6a53cd21d040c483524f90c6d0a768bed2f63cdcab387c not found: ID does not exist" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.880020 4728 scope.go:117] "RemoveContainer" containerID="f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99" Dec 16 15:18:26 crc kubenswrapper[4728]: E1216 15:18:26.881163 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99\": container with ID starting with f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99 not found: ID does not exist" containerID="f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99" Dec 16 15:18:26 crc kubenswrapper[4728]: I1216 15:18:26.881186 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99"} err="failed to get container status \"f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99\": rpc error: code = NotFound desc = could not find container \"f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99\": container with ID starting with f30ca02fda9164f8f0c212f9f9ae37a4dfc0dd34c3d73ec3d958d8cdb7296f99 not found: ID does not exist" Dec 16 15:18:27 crc kubenswrapper[4728]: I1216 15:18:27.006681 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-xsxxz"] Dec 16 15:18:27 crc kubenswrapper[4728]: W1216 15:18:27.011466 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e2028c_f46c_4fd1_8dee_4fb4860de081.slice/crio-707062ddb0c635f272fae3e7e0315521d0e770c08d36d86ff305d86448bc8726 WatchSource:0}: Error finding container 707062ddb0c635f272fae3e7e0315521d0e770c08d36d86ff305d86448bc8726: Status 404 returned error can't find the container with id 707062ddb0c635f272fae3e7e0315521d0e770c08d36d86ff305d86448bc8726 Dec 16 15:18:27 crc kubenswrapper[4728]: I1216 15:18:27.522196 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" path="/var/lib/kubelet/pods/d6f19c63-40d1-4c3d-9c6d-027581f63b2c/volumes" Dec 16 15:18:27 crc kubenswrapper[4728]: I1216 15:18:27.831613 4728 generic.go:334] "Generic (PLEG): container finished" podID="e4e2028c-f46c-4fd1-8dee-4fb4860de081" 
containerID="7cb60b4f60cc2337aab01b4767372b054e3bb95e17ecc8637fce93e08289aebb" exitCode=0 Dec 16 15:18:27 crc kubenswrapper[4728]: I1216 15:18:27.831686 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" event={"ID":"e4e2028c-f46c-4fd1-8dee-4fb4860de081","Type":"ContainerDied","Data":"7cb60b4f60cc2337aab01b4767372b054e3bb95e17ecc8637fce93e08289aebb"} Dec 16 15:18:27 crc kubenswrapper[4728]: I1216 15:18:27.831752 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" event={"ID":"e4e2028c-f46c-4fd1-8dee-4fb4860de081","Type":"ContainerStarted","Data":"707062ddb0c635f272fae3e7e0315521d0e770c08d36d86ff305d86448bc8726"} Dec 16 15:18:28 crc kubenswrapper[4728]: I1216 15:18:28.843308 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" event={"ID":"e4e2028c-f46c-4fd1-8dee-4fb4860de081","Type":"ContainerStarted","Data":"b7350d5a9ca748b356ef98705b78bba715ba6ed0e3cbe8ef9ed5aa26360961de"} Dec 16 15:18:28 crc kubenswrapper[4728]: I1216 15:18:28.843897 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:28 crc kubenswrapper[4728]: I1216 15:18:28.869057 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" podStartSLOduration=2.869040436 podStartE2EDuration="2.869040436s" podCreationTimestamp="2025-12-16 15:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:28.863881274 +0000 UTC m=+1289.704060268" watchObservedRunningTime="2025-12-16 15:18:28.869040436 +0000 UTC m=+1289.709219410" Dec 16 15:18:36 crc kubenswrapper[4728]: I1216 15:18:36.525731 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-xsxxz" Dec 16 15:18:36 crc kubenswrapper[4728]: I1216 15:18:36.631350 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-x2jwj"] Dec 16 15:18:36 crc kubenswrapper[4728]: I1216 15:18:36.631786 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerName="dnsmasq-dns" containerID="cri-o://7fd4c57ca1c257a6044cc34fb8b602e0f90ece2849abfe05780c2ebadb0477c5" gracePeriod=10 Dec 16 15:18:36 crc kubenswrapper[4728]: I1216 15:18:36.928811 4728 generic.go:334] "Generic (PLEG): container finished" podID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerID="7fd4c57ca1c257a6044cc34fb8b602e0f90ece2849abfe05780c2ebadb0477c5" exitCode=0 Dec 16 15:18:36 crc kubenswrapper[4728]: I1216 15:18:36.928899 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" event={"ID":"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed","Type":"ContainerDied","Data":"7fd4c57ca1c257a6044cc34fb8b602e0f90ece2849abfe05780c2ebadb0477c5"} Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.038598 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172207 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-swift-storage-0\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172256 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-svc\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172297 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-config\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172469 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-nb\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-openstack-edpm-ipam\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172554 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-sb\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.172594 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr8gn\" (UniqueName: \"kubernetes.io/projected/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-kube-api-access-fr8gn\") pod \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\" (UID: \"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed\") " Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.179111 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-kube-api-access-fr8gn" (OuterVolumeSpecName: "kube-api-access-fr8gn") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "kube-api-access-fr8gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.219270 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.223600 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.227549 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.228214 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.243907 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.244705 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-config" (OuterVolumeSpecName: "config") pod "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" (UID: "343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275269 4728 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275329 4728 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275340 4728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275349 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275360 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275368 4728 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.275376 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr8gn\" (UniqueName: \"kubernetes.io/projected/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed-kube-api-access-fr8gn\") on node \"crc\" DevicePath \"\"" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.938075 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" event={"ID":"343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed","Type":"ContainerDied","Data":"e2f326078dd8220d9926b8e5224342c9067215ab044c3b9d87447bab98f244e5"} Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.938456 4728 scope.go:117] "RemoveContainer" containerID="7fd4c57ca1c257a6044cc34fb8b602e0f90ece2849abfe05780c2ebadb0477c5" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.938571 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-x2jwj" Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.968904 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-x2jwj"] Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.977164 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-x2jwj"] Dec 16 15:18:37 crc kubenswrapper[4728]: I1216 15:18:37.981959 4728 scope.go:117] "RemoveContainer" containerID="10418afb645e15c45538d0f15a6479d1a5aa2fca8b657b6aafcb24d18712547b" Dec 16 15:18:39 crc kubenswrapper[4728]: I1216 15:18:39.524475 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" path="/var/lib/kubelet/pods/343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed/volumes" Dec 16 15:18:48 crc kubenswrapper[4728]: I1216 15:18:48.998341 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx"] Dec 16 15:18:49 crc kubenswrapper[4728]: E1216 15:18:48.999195 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerName="dnsmasq-dns" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:48.999208 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerName="dnsmasq-dns" Dec 16 15:18:49 crc kubenswrapper[4728]: E1216 15:18:48.999220 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerName="init" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:48.999226 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerName="init" Dec 16 15:18:49 crc kubenswrapper[4728]: E1216 15:18:48.999242 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerName="init" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:48.999249 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerName="init" Dec 16 15:18:49 crc kubenswrapper[4728]: E1216 15:18:48.999272 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerName="dnsmasq-dns" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:48.999278 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerName="dnsmasq-dns" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:48.999497 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="343a7ae3-b43c-4ded-a0ed-9b0512d9b5ed" containerName="dnsmasq-dns" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:48.999520 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f19c63-40d1-4c3d-9c6d-027581f63b2c" containerName="dnsmasq-dns" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.000271 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.006951 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.007107 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.007116 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.007114 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.018076 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx"] Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.129463 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.129571 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.129605 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxbb5\" (UniqueName: \"kubernetes.io/projected/37e0ae2a-b0ba-45a7-9395-1af1365adf86-kube-api-access-lxbb5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.129671 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.231313 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.231490 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.231566 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.231598 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxbb5\" (UniqueName: \"kubernetes.io/projected/37e0ae2a-b0ba-45a7-9395-1af1365adf86-kube-api-access-lxbb5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.240153 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.240288 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.245122 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.255885 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxbb5\" (UniqueName: \"kubernetes.io/projected/37e0ae2a-b0ba-45a7-9395-1af1365adf86-kube-api-access-lxbb5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.330221 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:18:49 crc kubenswrapper[4728]: I1216 15:18:49.872319 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx"] Dec 16 15:18:49 crc kubenswrapper[4728]: W1216 15:18:49.877288 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e0ae2a_b0ba_45a7_9395_1af1365adf86.slice/crio-eecb8c3efe4ddd9fa873c812767b047901494ee57799942e0eee13f98ef4bb65 WatchSource:0}: Error finding container eecb8c3efe4ddd9fa873c812767b047901494ee57799942e0eee13f98ef4bb65: Status 404 returned error can't find the container with id eecb8c3efe4ddd9fa873c812767b047901494ee57799942e0eee13f98ef4bb65 Dec 16 15:18:50 crc kubenswrapper[4728]: I1216 15:18:50.082838 4728 generic.go:334] "Generic (PLEG): container finished" podID="e19aee19-231d-4847-9e7e-78b8745576ae" containerID="c0fa4ab00aae131d629422d1c90b250a0bc87a1811716e09d537baa47a369201" exitCode=0 Dec 16 15:18:50 crc kubenswrapper[4728]: I1216 15:18:50.082910 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e19aee19-231d-4847-9e7e-78b8745576ae","Type":"ContainerDied","Data":"c0fa4ab00aae131d629422d1c90b250a0bc87a1811716e09d537baa47a369201"} Dec 16 15:18:50 crc kubenswrapper[4728]: I1216 15:18:50.086002 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" event={"ID":"37e0ae2a-b0ba-45a7-9395-1af1365adf86","Type":"ContainerStarted","Data":"eecb8c3efe4ddd9fa873c812767b047901494ee57799942e0eee13f98ef4bb65"} Dec 16 15:18:51 crc kubenswrapper[4728]: I1216 15:18:51.097728 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e19aee19-231d-4847-9e7e-78b8745576ae","Type":"ContainerStarted","Data":"37750967716d34718133802f0620bcd06e8361eefadaa1c3ccc6fa8f54f23238"} Dec 16 15:18:51 crc kubenswrapper[4728]: I1216 15:18:51.098333 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 16 15:18:51 crc kubenswrapper[4728]: I1216 15:18:51.099995 4728 generic.go:334] "Generic (PLEG): container finished" podID="e64ff4ca-1141-477e-8db1-b2068e3b6d9a" containerID="c0c45be4a49e59d096b150ded73b4b5ef2f193ebf1b4b99470041205f8fb4551" exitCode=0 Dec 16 15:18:51 crc kubenswrapper[4728]: I1216 15:18:51.100050 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e64ff4ca-1141-477e-8db1-b2068e3b6d9a","Type":"ContainerDied","Data":"c0c45be4a49e59d096b150ded73b4b5ef2f193ebf1b4b99470041205f8fb4551"} Dec 16 15:18:51 crc kubenswrapper[4728]: I1216 15:18:51.152431 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.152383911 podStartE2EDuration="37.152383911s" podCreationTimestamp="2025-12-16 15:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:51.136117197 +0000 UTC m=+1311.976296201" watchObservedRunningTime="2025-12-16 15:18:51.152383911 +0000 UTC m=+1311.992562895" Dec 16 15:18:52 crc kubenswrapper[4728]: I1216 15:18:52.113764 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"e64ff4ca-1141-477e-8db1-b2068e3b6d9a","Type":"ContainerStarted","Data":"0b5c4764dd40dfd627810e09ef5415163699ca621d112ed4ae4f88991b6df63d"} Dec 16 15:18:52 crc kubenswrapper[4728]: I1216 15:18:52.115393 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:18:52 crc kubenswrapper[4728]: I1216 15:18:52.142936 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.142917643 podStartE2EDuration="37.142917643s" podCreationTimestamp="2025-12-16 15:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:18:52.141137114 +0000 UTC m=+1312.981316128" watchObservedRunningTime="2025-12-16 15:18:52.142917643 +0000 UTC m=+1312.983096617" Dec 16 15:19:01 crc kubenswrapper[4728]: I1216 15:19:01.821231 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:19:02 crc kubenswrapper[4728]: I1216 15:19:02.198659 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" event={"ID":"37e0ae2a-b0ba-45a7-9395-1af1365adf86","Type":"ContainerStarted","Data":"7e31d0a697705833d66ed1a111beb5b99ec21d8d194c24b3d2dc42e393e04e45"} Dec 16 15:19:02 crc kubenswrapper[4728]: I1216 15:19:02.226734 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" podStartSLOduration=2.288814728 podStartE2EDuration="14.226710029s" podCreationTimestamp="2025-12-16 15:18:48 +0000 UTC" firstStartedPulling="2025-12-16 15:18:49.880043351 +0000 UTC m=+1310.720222335" lastFinishedPulling="2025-12-16 15:19:01.817938612 +0000 UTC m=+1322.658117636" observedRunningTime="2025-12-16 15:19:02.216590674 +0000 UTC m=+1323.056769658" watchObservedRunningTime="2025-12-16 15:19:02.226710029 +0000 UTC m=+1323.066889023" Dec 16 15:19:05 crc kubenswrapper[4728]: I1216 15:19:05.051798 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 16 15:19:06 crc kubenswrapper[4728]: I1216 15:19:06.217649 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 16 15:19:14 crc kubenswrapper[4728]: I1216 15:19:14.324072 4728 generic.go:334] "Generic (PLEG): container finished" podID="37e0ae2a-b0ba-45a7-9395-1af1365adf86" containerID="7e31d0a697705833d66ed1a111beb5b99ec21d8d194c24b3d2dc42e393e04e45" exitCode=0 Dec 16 15:19:14 crc kubenswrapper[4728]: I1216 15:19:14.324157 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" event={"ID":"37e0ae2a-b0ba-45a7-9395-1af1365adf86","Type":"ContainerDied","Data":"7e31d0a697705833d66ed1a111beb5b99ec21d8d194c24b3d2dc42e393e04e45"} Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.756012 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.787315 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-ssh-key\") pod \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.787480 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-repo-setup-combined-ca-bundle\") pod \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.787524 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxbb5\" (UniqueName: \"kubernetes.io/projected/37e0ae2a-b0ba-45a7-9395-1af1365adf86-kube-api-access-lxbb5\") pod \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.787569 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-inventory\") pod \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\" (UID: \"37e0ae2a-b0ba-45a7-9395-1af1365adf86\") " Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.793922 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "37e0ae2a-b0ba-45a7-9395-1af1365adf86" (UID: "37e0ae2a-b0ba-45a7-9395-1af1365adf86"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.797973 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e0ae2a-b0ba-45a7-9395-1af1365adf86-kube-api-access-lxbb5" (OuterVolumeSpecName: "kube-api-access-lxbb5") pod "37e0ae2a-b0ba-45a7-9395-1af1365adf86" (UID: "37e0ae2a-b0ba-45a7-9395-1af1365adf86"). InnerVolumeSpecName "kube-api-access-lxbb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.816648 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "37e0ae2a-b0ba-45a7-9395-1af1365adf86" (UID: "37e0ae2a-b0ba-45a7-9395-1af1365adf86"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.818696 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-inventory" (OuterVolumeSpecName: "inventory") pod "37e0ae2a-b0ba-45a7-9395-1af1365adf86" (UID: "37e0ae2a-b0ba-45a7-9395-1af1365adf86"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.890307 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.890344 4728 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.890356 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxbb5\" (UniqueName: \"kubernetes.io/projected/37e0ae2a-b0ba-45a7-9395-1af1365adf86-kube-api-access-lxbb5\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:15 crc kubenswrapper[4728]: I1216 15:19:15.890366 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37e0ae2a-b0ba-45a7-9395-1af1365adf86-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.347661 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" event={"ID":"37e0ae2a-b0ba-45a7-9395-1af1365adf86","Type":"ContainerDied","Data":"eecb8c3efe4ddd9fa873c812767b047901494ee57799942e0eee13f98ef4bb65"} Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.348038 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eecb8c3efe4ddd9fa873c812767b047901494ee57799942e0eee13f98ef4bb65" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.347790 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.461809 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q"] Dec 16 15:19:16 crc kubenswrapper[4728]: E1216 15:19:16.462195 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e0ae2a-b0ba-45a7-9395-1af1365adf86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.462211 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e0ae2a-b0ba-45a7-9395-1af1365adf86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.462399 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e0ae2a-b0ba-45a7-9395-1af1365adf86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.462999 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.465682 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.465858 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.465998 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.466106 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.480503 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q"] Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.602829 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57q8x\" (UniqueName: \"kubernetes.io/projected/447d1f35-7fe1-4655-8893-3ca4afed13d6-kube-api-access-57q8x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.602921 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.603318 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.706428 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57q8x\" (UniqueName: \"kubernetes.io/projected/447d1f35-7fe1-4655-8893-3ca4afed13d6-kube-api-access-57q8x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.706568 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.706650 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.715220 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.715681 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.737745 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57q8x\" (UniqueName: \"kubernetes.io/projected/447d1f35-7fe1-4655-8893-3ca4afed13d6-kube-api-access-57q8x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fbn4q\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:16 crc kubenswrapper[4728]: I1216 15:19:16.792445 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:17 crc kubenswrapper[4728]: I1216 15:19:17.403714 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q"] Dec 16 15:19:17 crc kubenswrapper[4728]: W1216 15:19:17.411651 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod447d1f35_7fe1_4655_8893_3ca4afed13d6.slice/crio-ed203f918f6cf9d5a658f03c579751d7cfd42c03a2f838623722ed0ca0992983 WatchSource:0}: Error finding container ed203f918f6cf9d5a658f03c579751d7cfd42c03a2f838623722ed0ca0992983: Status 404 returned error can't find the container with id ed203f918f6cf9d5a658f03c579751d7cfd42c03a2f838623722ed0ca0992983 Dec 16 15:19:18 crc kubenswrapper[4728]: I1216 15:19:18.373229 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" event={"ID":"447d1f35-7fe1-4655-8893-3ca4afed13d6","Type":"ContainerStarted","Data":"ed203f918f6cf9d5a658f03c579751d7cfd42c03a2f838623722ed0ca0992983"} Dec 16 15:19:19 crc kubenswrapper[4728]: I1216 15:19:19.386824 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" event={"ID":"447d1f35-7fe1-4655-8893-3ca4afed13d6","Type":"ContainerStarted","Data":"0682285afe2276dcd40458b378b447e2242462b5e1842ad5279204ee0bf14463"} Dec 16 15:19:19 crc kubenswrapper[4728]: I1216 15:19:19.411300 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" podStartSLOduration=2.635808043 podStartE2EDuration="3.411274499s" podCreationTimestamp="2025-12-16 15:19:16 +0000 UTC" firstStartedPulling="2025-12-16 15:19:17.414037315 +0000 UTC m=+1338.254216309" lastFinishedPulling="2025-12-16 15:19:18.189503771 +0000 UTC m=+1339.029682765" observedRunningTime="2025-12-16 15:19:19.399005413 +0000 UTC m=+1340.239184427" watchObservedRunningTime="2025-12-16 15:19:19.411274499 +0000 UTC 
m=+1340.251453503" Dec 16 15:19:21 crc kubenswrapper[4728]: I1216 15:19:21.411877 4728 generic.go:334] "Generic (PLEG): container finished" podID="447d1f35-7fe1-4655-8893-3ca4afed13d6" containerID="0682285afe2276dcd40458b378b447e2242462b5e1842ad5279204ee0bf14463" exitCode=0 Dec 16 15:19:21 crc kubenswrapper[4728]: I1216 15:19:21.411934 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" event={"ID":"447d1f35-7fe1-4655-8893-3ca4afed13d6","Type":"ContainerDied","Data":"0682285afe2276dcd40458b378b447e2242462b5e1842ad5279204ee0bf14463"} Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.822970 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.842954 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-ssh-key\") pod \"447d1f35-7fe1-4655-8893-3ca4afed13d6\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.843225 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-inventory\") pod \"447d1f35-7fe1-4655-8893-3ca4afed13d6\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.843273 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57q8x\" (UniqueName: \"kubernetes.io/projected/447d1f35-7fe1-4655-8893-3ca4afed13d6-kube-api-access-57q8x\") pod \"447d1f35-7fe1-4655-8893-3ca4afed13d6\" (UID: \"447d1f35-7fe1-4655-8893-3ca4afed13d6\") " Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.851090 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/447d1f35-7fe1-4655-8893-3ca4afed13d6-kube-api-access-57q8x" (OuterVolumeSpecName: "kube-api-access-57q8x") pod "447d1f35-7fe1-4655-8893-3ca4afed13d6" (UID: "447d1f35-7fe1-4655-8893-3ca4afed13d6"). InnerVolumeSpecName "kube-api-access-57q8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.875954 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-inventory" (OuterVolumeSpecName: "inventory") pod "447d1f35-7fe1-4655-8893-3ca4afed13d6" (UID: "447d1f35-7fe1-4655-8893-3ca4afed13d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.888799 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "447d1f35-7fe1-4655-8893-3ca4afed13d6" (UID: "447d1f35-7fe1-4655-8893-3ca4afed13d6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.945510 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.945535 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/447d1f35-7fe1-4655-8893-3ca4afed13d6-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:22 crc kubenswrapper[4728]: I1216 15:19:22.945545 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57q8x\" (UniqueName: \"kubernetes.io/projected/447d1f35-7fe1-4655-8893-3ca4afed13d6-kube-api-access-57q8x\") on node \"crc\" DevicePath \"\"" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.436081 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" event={"ID":"447d1f35-7fe1-4655-8893-3ca4afed13d6","Type":"ContainerDied","Data":"ed203f918f6cf9d5a658f03c579751d7cfd42c03a2f838623722ed0ca0992983"} Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.436121 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed203f918f6cf9d5a658f03c579751d7cfd42c03a2f838623722ed0ca0992983" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.436142 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fbn4q" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.521571 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr"] Dec 16 15:19:23 crc kubenswrapper[4728]: E1216 15:19:23.522366 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="447d1f35-7fe1-4655-8893-3ca4afed13d6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.522569 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="447d1f35-7fe1-4655-8893-3ca4afed13d6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.523070 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="447d1f35-7fe1-4655-8893-3ca4afed13d6" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.524232 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.526253 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.526631 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.526925 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.527147 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.533969 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr"] Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.558310 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.558507 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rkb\" (UniqueName: \"kubernetes.io/projected/801eb0fd-312d-4913-8608-52baf1c65fea-kube-api-access-45rkb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.558639 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.558676 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.660182 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rkb\" (UniqueName: \"kubernetes.io/projected/801eb0fd-312d-4913-8608-52baf1c65fea-kube-api-access-45rkb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.660755 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.661017 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.661701 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.665429 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.667282 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.669598 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.689157 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rkb\" (UniqueName: \"kubernetes.io/projected/801eb0fd-312d-4913-8608-52baf1c65fea-kube-api-access-45rkb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:23 crc kubenswrapper[4728]: I1216 15:19:23.852467 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:19:25 crc kubenswrapper[4728]: I1216 15:19:24.423076 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr"] Dec 16 15:19:25 crc kubenswrapper[4728]: I1216 15:19:24.449450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" event={"ID":"801eb0fd-312d-4913-8608-52baf1c65fea","Type":"ContainerStarted","Data":"c2b932493ed3f0afef535eeb360515088298028b54871ce7138ae809f66705c8"} Dec 16 15:19:26 crc kubenswrapper[4728]: I1216 15:19:26.469856 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" event={"ID":"801eb0fd-312d-4913-8608-52baf1c65fea","Type":"ContainerStarted","Data":"676bb11b46c32bc4ef34665732820321bb6c8ac1a4afbb0549a568300edff153"} Dec 16 15:19:26 crc kubenswrapper[4728]: I1216 15:19:26.492311 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" podStartSLOduration=2.6060986 podStartE2EDuration="3.49229312s" podCreationTimestamp="2025-12-16 15:19:23 +0000 UTC" firstStartedPulling="2025-12-16 15:19:24.434528243 +0000 UTC m=+1345.274707267" lastFinishedPulling="2025-12-16 15:19:25.320722773 +0000 UTC m=+1346.160901787" observedRunningTime="2025-12-16 15:19:26.486644386 +0000 UTC m=+1347.326823400" watchObservedRunningTime="2025-12-16 15:19:26.49229312 +0000 UTC m=+1347.332472104" Dec 16 15:20:15 crc kubenswrapper[4728]: I1216 15:20:15.882911 4728 scope.go:117] "RemoveContainer" containerID="7bfa0516798c743159e966ef01cdc65ffa6700edd375c99066873e5011be5539" Dec 16 15:20:15 crc kubenswrapper[4728]: I1216 15:20:15.929680 4728 scope.go:117] "RemoveContainer" containerID="f60f105b427383c69359296263469ae00ab54d3f8189124ab7a543f226cd6d2e" Dec 16 15:20:38 crc kubenswrapper[4728]: I1216 15:20:38.818465 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:20:38 crc kubenswrapper[4728]: I1216 15:20:38.819279 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:21:08 crc kubenswrapper[4728]: I1216 15:21:08.818599 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:21:08 crc kubenswrapper[4728]: I1216 15:21:08.819174 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:21:38 crc kubenswrapper[4728]: I1216 15:21:38.819279 4728 patch_prober.go:28] interesting 
pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:21:38 crc kubenswrapper[4728]: I1216 15:21:38.821737 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:21:38 crc kubenswrapper[4728]: I1216 15:21:38.821842 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:21:38 crc kubenswrapper[4728]: I1216 15:21:38.823195 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf4555b97afbd3b3d2de44b030a2e6b901aec1b1c9811cdabf788a725a3bd7ca"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:21:38 crc kubenswrapper[4728]: I1216 15:21:38.823326 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://cf4555b97afbd3b3d2de44b030a2e6b901aec1b1c9811cdabf788a725a3bd7ca" gracePeriod=600 Dec 16 15:21:39 crc kubenswrapper[4728]: I1216 15:21:39.888894 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="cf4555b97afbd3b3d2de44b030a2e6b901aec1b1c9811cdabf788a725a3bd7ca" exitCode=0 Dec 16 15:21:39 crc kubenswrapper[4728]: I1216 15:21:39.888937 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"cf4555b97afbd3b3d2de44b030a2e6b901aec1b1c9811cdabf788a725a3bd7ca"} Dec 16 15:21:39 crc kubenswrapper[4728]: I1216 15:21:39.889524 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6"} Dec 16 15:21:39 crc kubenswrapper[4728]: I1216 15:21:39.889551 4728 scope.go:117] "RemoveContainer" containerID="b1b468897a2b4285ac91242e60a4e7ba38f4d070d647de3374233d1332ee4a0d" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.694450 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d9964"] Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.697134 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.717552 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9964"] Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.808529 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4qzj\" (UniqueName: \"kubernetes.io/projected/bec595b1-e989-418f-9f33-5ad1d1112e7c-kube-api-access-z4qzj\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.808587 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-catalog-content\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.808710 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-utilities\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.910492 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4qzj\" (UniqueName: \"kubernetes.io/projected/bec595b1-e989-418f-9f33-5ad1d1112e7c-kube-api-access-z4qzj\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.910723 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-catalog-content\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.910823 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-utilities\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.911361 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-catalog-content\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.911474 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-utilities\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:21:59 crc kubenswrapper[4728]: I1216 15:21:59.937355 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z4qzj\" (UniqueName: \"kubernetes.io/projected/bec595b1-e989-418f-9f33-5ad1d1112e7c-kube-api-access-z4qzj\") pod \"redhat-operators-d9964\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:00 crc kubenswrapper[4728]: I1216 15:22:00.031435 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:00 crc kubenswrapper[4728]: W1216 15:22:00.595080 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec595b1_e989_418f_9f33_5ad1d1112e7c.slice/crio-e538c8b5d35941d14f33dc1f561366799a9e4cf0fb8b3c9cd8a30d7501cf286e WatchSource:0}: Error finding container e538c8b5d35941d14f33dc1f561366799a9e4cf0fb8b3c9cd8a30d7501cf286e: Status 404 returned error can't find the container with id e538c8b5d35941d14f33dc1f561366799a9e4cf0fb8b3c9cd8a30d7501cf286e Dec 16 15:22:00 crc kubenswrapper[4728]: I1216 15:22:00.595963 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9964"] Dec 16 15:22:01 crc kubenswrapper[4728]: I1216 15:22:01.121051 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerStarted","Data":"e538c8b5d35941d14f33dc1f561366799a9e4cf0fb8b3c9cd8a30d7501cf286e"} Dec 16 15:22:02 crc kubenswrapper[4728]: I1216 15:22:02.132976 4728 generic.go:334] "Generic (PLEG): container finished" podID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerID="7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5" exitCode=0 Dec 16 15:22:02 crc kubenswrapper[4728]: I1216 15:22:02.133110 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerDied","Data":"7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5"} Dec 16 15:22:02 crc kubenswrapper[4728]: I1216 15:22:02.135125 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:22:04 crc kubenswrapper[4728]: I1216 15:22:04.158596 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerStarted","Data":"68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d"} Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.069914 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-567nw"] Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.072258 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.084704 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-567nw"] Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.246578 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-catalog-content\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.246630 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-kube-api-access-k245d\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.246647 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-utilities\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.347894 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-catalog-content\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.347948 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-kube-api-access-k245d\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.347967 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-utilities\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.348544 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-catalog-content\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.348545 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-utilities\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.369317 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-kube-api-access-k245d\") pod \"certified-operators-567nw\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.403521 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:06 crc kubenswrapper[4728]: I1216 15:22:06.954308 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-567nw"] Dec 16 15:22:06 crc kubenswrapper[4728]: W1216 15:22:06.961662 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc49956_67ef_44f4_aaa0_5fd7560fd5df.slice/crio-40c06bcde0080c6238fd07a5ba12b232a7cf93e8376b0f58072e1504810a84b4 WatchSource:0}: Error finding container 40c06bcde0080c6238fd07a5ba12b232a7cf93e8376b0f58072e1504810a84b4: Status 404 returned error can't find the container with id 40c06bcde0080c6238fd07a5ba12b232a7cf93e8376b0f58072e1504810a84b4 Dec 16 15:22:07 crc kubenswrapper[4728]: I1216 15:22:07.193684 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerStarted","Data":"e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48"} Dec 16 15:22:07 crc kubenswrapper[4728]: I1216 15:22:07.194056 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerStarted","Data":"40c06bcde0080c6238fd07a5ba12b232a7cf93e8376b0f58072e1504810a84b4"} Dec 16 15:22:07 crc kubenswrapper[4728]: I1216 15:22:07.196308 4728 generic.go:334] "Generic (PLEG): container finished" podID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerID="68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d" exitCode=0 Dec 16 15:22:07 crc kubenswrapper[4728]: I1216 15:22:07.196372 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerDied","Data":"68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d"} Dec 16 15:22:08 crc kubenswrapper[4728]: I1216 15:22:08.210114 4728 generic.go:334] "Generic (PLEG): container finished" podID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerID="e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48" exitCode=0 Dec 16 15:22:08 crc kubenswrapper[4728]: I1216 15:22:08.210164 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerDied","Data":"e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48"} Dec 16 15:22:09 crc kubenswrapper[4728]: I1216 15:22:09.222836 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerStarted","Data":"ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1"} Dec 16 15:22:09 crc kubenswrapper[4728]: I1216 15:22:09.225922 4728 generic.go:334] "Generic (PLEG): container finished" podID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" 
containerID="e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263" exitCode=0 Dec 16 15:22:09 crc kubenswrapper[4728]: I1216 15:22:09.225978 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerDied","Data":"e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263"} Dec 16 15:22:09 crc kubenswrapper[4728]: I1216 15:22:09.252180 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d9964" podStartSLOduration=4.212905621 podStartE2EDuration="10.252163492s" podCreationTimestamp="2025-12-16 15:21:59 +0000 UTC" firstStartedPulling="2025-12-16 15:22:02.134801605 +0000 UTC m=+1502.974980589" lastFinishedPulling="2025-12-16 15:22:08.174059456 +0000 UTC m=+1509.014238460" observedRunningTime="2025-12-16 15:22:09.250897438 +0000 UTC m=+1510.091076432" watchObservedRunningTime="2025-12-16 15:22:09.252163492 +0000 UTC m=+1510.092342476" Dec 16 15:22:10 crc kubenswrapper[4728]: I1216 15:22:10.032182 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:10 crc kubenswrapper[4728]: I1216 15:22:10.032589 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:10 crc kubenswrapper[4728]: I1216 15:22:10.238548 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerStarted","Data":"786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263"} Dec 16 15:22:11 crc kubenswrapper[4728]: I1216 15:22:11.089826 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d9964" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="registry-server" probeResult="failure" output=< Dec 16 15:22:11 crc kubenswrapper[4728]: timeout: failed to connect service ":50051" within 1s Dec 16 15:22:11 crc kubenswrapper[4728]: > Dec 16 15:22:16 crc kubenswrapper[4728]: I1216 15:22:16.404558 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:16 crc kubenswrapper[4728]: I1216 15:22:16.405287 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:16 crc kubenswrapper[4728]: I1216 15:22:16.460723 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:16 crc kubenswrapper[4728]: I1216 15:22:16.499525 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-567nw" podStartSLOduration=7.95494641 podStartE2EDuration="10.499502232s" podCreationTimestamp="2025-12-16 15:22:06 +0000 UTC" firstStartedPulling="2025-12-16 15:22:07.195567689 +0000 UTC m=+1508.035746673" lastFinishedPulling="2025-12-16 15:22:09.740123491 +0000 UTC m=+1510.580302495" observedRunningTime="2025-12-16 15:22:10.278998285 +0000 UTC m=+1511.119177289" watchObservedRunningTime="2025-12-16 15:22:16.499502232 +0000 UTC m=+1517.339681236" Dec 16 15:22:17 crc kubenswrapper[4728]: I1216 15:22:17.396159 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-567nw" 
Dec 16 15:22:17 crc kubenswrapper[4728]: I1216 15:22:17.462234 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-567nw"] Dec 16 15:22:19 crc kubenswrapper[4728]: I1216 15:22:19.331798 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-567nw" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="registry-server" containerID="cri-o://786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263" gracePeriod=2 Dec 16 15:22:20 crc kubenswrapper[4728]: I1216 15:22:20.074255 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:20 crc kubenswrapper[4728]: I1216 15:22:20.139206 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:20 crc kubenswrapper[4728]: I1216 15:22:20.309634 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9964"] Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.082790 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.190494 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-catalog-content\") pod \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.190868 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-utilities\") pod \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.191231 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-kube-api-access-k245d\") pod \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\" (UID: \"2bc49956-67ef-44f4-aaa0-5fd7560fd5df\") " Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.192774 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-utilities" (OuterVolumeSpecName: "utilities") pod "2bc49956-67ef-44f4-aaa0-5fd7560fd5df" (UID: "2bc49956-67ef-44f4-aaa0-5fd7560fd5df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.203137 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-kube-api-access-k245d" (OuterVolumeSpecName: "kube-api-access-k245d") pod "2bc49956-67ef-44f4-aaa0-5fd7560fd5df" (UID: "2bc49956-67ef-44f4-aaa0-5fd7560fd5df"). InnerVolumeSpecName "kube-api-access-k245d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.280365 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bc49956-67ef-44f4-aaa0-5fd7560fd5df" (UID: "2bc49956-67ef-44f4-aaa0-5fd7560fd5df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.293736 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.293758 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.293768 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/2bc49956-67ef-44f4-aaa0-5fd7560fd5df-kube-api-access-k245d\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.350930 4728 generic.go:334] "Generic (PLEG): container finished" podID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerID="786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263" exitCode=0 Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.350987 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567nw" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.351035 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerDied","Data":"786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263"} Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.351083 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567nw" event={"ID":"2bc49956-67ef-44f4-aaa0-5fd7560fd5df","Type":"ContainerDied","Data":"40c06bcde0080c6238fd07a5ba12b232a7cf93e8376b0f58072e1504810a84b4"} Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.351106 4728 scope.go:117] "RemoveContainer" containerID="786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.351393 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d9964" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="registry-server" containerID="cri-o://ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1" gracePeriod=2 Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.389254 4728 scope.go:117] "RemoveContainer" containerID="e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.394976 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-567nw"] Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.404376 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-567nw"] Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.435495 4728 
scope.go:117] "RemoveContainer" containerID="e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.519361 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" path="/var/lib/kubelet/pods/2bc49956-67ef-44f4-aaa0-5fd7560fd5df/volumes" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.566858 4728 scope.go:117] "RemoveContainer" containerID="786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263" Dec 16 15:22:21 crc kubenswrapper[4728]: E1216 15:22:21.567428 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263\": container with ID starting with 786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263 not found: ID does not exist" containerID="786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.567458 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263"} err="failed to get container status \"786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263\": rpc error: code = NotFound desc = could not find container \"786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263\": container with ID starting with 786fef73ebdd403dd0a136272db08622bfed69f50d3382d4b4881ca0a465d263 not found: ID does not exist" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.567482 4728 scope.go:117] "RemoveContainer" containerID="e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263" Dec 16 15:22:21 crc kubenswrapper[4728]: E1216 15:22:21.567823 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263\": container with ID starting with e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263 not found: ID does not exist" containerID="e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.567843 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263"} err="failed to get container status \"e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263\": rpc error: code = NotFound desc = could not find container \"e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263\": container with ID starting with e8c21379e650340d234847917df7b0fe4a855d563648b9b02566ec3d2f769263 not found: ID does not exist" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.567858 4728 scope.go:117] "RemoveContainer" containerID="e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48" Dec 16 15:22:21 crc kubenswrapper[4728]: E1216 15:22:21.568131 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48\": container with ID starting with e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48 not found: ID does not exist" containerID="e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.568150 4728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48"} err="failed to get container status \"e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48\": rpc error: code = NotFound desc = could not find container \"e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48\": container with ID starting with e230371b00b916cc6cc357c0d0ef8b3302a66ffee325921c744aadd4580d2d48 not found: ID does not exist" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.711293 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.801867 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-catalog-content\") pod \"bec595b1-e989-418f-9f33-5ad1d1112e7c\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.802114 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4qzj\" (UniqueName: \"kubernetes.io/projected/bec595b1-e989-418f-9f33-5ad1d1112e7c-kube-api-access-z4qzj\") pod \"bec595b1-e989-418f-9f33-5ad1d1112e7c\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.802140 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-utilities\") pod \"bec595b1-e989-418f-9f33-5ad1d1112e7c\" (UID: \"bec595b1-e989-418f-9f33-5ad1d1112e7c\") " Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.803246 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-utilities" (OuterVolumeSpecName: "utilities") pod "bec595b1-e989-418f-9f33-5ad1d1112e7c" (UID: "bec595b1-e989-418f-9f33-5ad1d1112e7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.810839 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec595b1-e989-418f-9f33-5ad1d1112e7c-kube-api-access-z4qzj" (OuterVolumeSpecName: "kube-api-access-z4qzj") pod "bec595b1-e989-418f-9f33-5ad1d1112e7c" (UID: "bec595b1-e989-418f-9f33-5ad1d1112e7c"). InnerVolumeSpecName "kube-api-access-z4qzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.903734 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4qzj\" (UniqueName: \"kubernetes.io/projected/bec595b1-e989-418f-9f33-5ad1d1112e7c-kube-api-access-z4qzj\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.903769 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:21 crc kubenswrapper[4728]: I1216 15:22:21.949174 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bec595b1-e989-418f-9f33-5ad1d1112e7c" (UID: "bec595b1-e989-418f-9f33-5ad1d1112e7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.006342 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec595b1-e989-418f-9f33-5ad1d1112e7c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.367068 4728 generic.go:334] "Generic (PLEG): container finished" podID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerID="ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1" exitCode=0 Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.367117 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerDied","Data":"ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1"} Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.367193 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9964" event={"ID":"bec595b1-e989-418f-9f33-5ad1d1112e7c","Type":"ContainerDied","Data":"e538c8b5d35941d14f33dc1f561366799a9e4cf0fb8b3c9cd8a30d7501cf286e"} Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.367197 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d9964" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.367241 4728 scope.go:117] "RemoveContainer" containerID="ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.399148 4728 scope.go:117] "RemoveContainer" containerID="68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.438829 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9964"] Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.447265 4728 scope.go:117] "RemoveContainer" containerID="7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.451420 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d9964"] Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.497159 4728 scope.go:117] "RemoveContainer" containerID="ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1" Dec 16 15:22:22 crc kubenswrapper[4728]: E1216 15:22:22.497754 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1\": container with ID starting with ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1 not found: ID does not exist" containerID="ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.497801 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1"} err="failed to get container status \"ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1\": rpc error: code = NotFound desc = could not find container \"ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1\": container with ID starting with ff8968641d194056112a4427dd4ee999bbe53d54c996b3f9325b7aff592c0ac1 not found: ID does not exist" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.497832 4728 scope.go:117] "RemoveContainer" containerID="68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d" Dec 16 15:22:22 crc kubenswrapper[4728]: E1216 15:22:22.498295 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d\": container with ID starting with 68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d not found: ID does not exist" containerID="68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.498330 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d"} err="failed to get container status \"68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d\": rpc error: code = NotFound desc = could not find container \"68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d\": container with ID starting with 68eb3ea21c5a3e16ce7d74aab38f834198879360ab890be927adcf4b86718f9d not found: ID does not exist" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.498360 4728 scope.go:117] "RemoveContainer" 
containerID="7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5" Dec 16 15:22:22 crc kubenswrapper[4728]: E1216 15:22:22.498677 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5\": container with ID starting with 7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5 not found: ID does not exist" containerID="7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5" Dec 16 15:22:22 crc kubenswrapper[4728]: I1216 15:22:22.498720 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5"} err="failed to get container status \"7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5\": rpc error: code = NotFound desc = could not find container \"7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5\": container with ID starting with 7a83de0c9a9501af949121afeb1a5e6ece06169a4272d3215eea701c006242a5 not found: ID does not exist" Dec 16 15:22:23 crc kubenswrapper[4728]: I1216 15:22:23.520147 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" path="/var/lib/kubelet/pods/bec595b1-e989-418f-9f33-5ad1d1112e7c/volumes" Dec 16 15:22:36 crc kubenswrapper[4728]: I1216 15:22:36.513594 4728 generic.go:334] "Generic (PLEG): container finished" podID="801eb0fd-312d-4913-8608-52baf1c65fea" containerID="676bb11b46c32bc4ef34665732820321bb6c8ac1a4afbb0549a568300edff153" exitCode=0 Dec 16 15:22:36 crc kubenswrapper[4728]: I1216 15:22:36.513804 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" event={"ID":"801eb0fd-312d-4913-8608-52baf1c65fea","Type":"ContainerDied","Data":"676bb11b46c32bc4ef34665732820321bb6c8ac1a4afbb0549a568300edff153"} Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.037631 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.146493 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rkb\" (UniqueName: \"kubernetes.io/projected/801eb0fd-312d-4913-8608-52baf1c65fea-kube-api-access-45rkb\") pod \"801eb0fd-312d-4913-8608-52baf1c65fea\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.146556 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-inventory\") pod \"801eb0fd-312d-4913-8608-52baf1c65fea\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.146745 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-ssh-key\") pod \"801eb0fd-312d-4913-8608-52baf1c65fea\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.146883 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-bootstrap-combined-ca-bundle\") pod \"801eb0fd-312d-4913-8608-52baf1c65fea\" (UID: \"801eb0fd-312d-4913-8608-52baf1c65fea\") " Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.153279 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801eb0fd-312d-4913-8608-52baf1c65fea-kube-api-access-45rkb" (OuterVolumeSpecName: "kube-api-access-45rkb") pod "801eb0fd-312d-4913-8608-52baf1c65fea" (UID: "801eb0fd-312d-4913-8608-52baf1c65fea"). InnerVolumeSpecName "kube-api-access-45rkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.155630 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "801eb0fd-312d-4913-8608-52baf1c65fea" (UID: "801eb0fd-312d-4913-8608-52baf1c65fea"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.180370 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "801eb0fd-312d-4913-8608-52baf1c65fea" (UID: "801eb0fd-312d-4913-8608-52baf1c65fea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.198216 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-inventory" (OuterVolumeSpecName: "inventory") pod "801eb0fd-312d-4913-8608-52baf1c65fea" (UID: "801eb0fd-312d-4913-8608-52baf1c65fea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.249660 4728 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.249724 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rkb\" (UniqueName: \"kubernetes.io/projected/801eb0fd-312d-4913-8608-52baf1c65fea-kube-api-access-45rkb\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.249744 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.249763 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/801eb0fd-312d-4913-8608-52baf1c65fea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.541292 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" event={"ID":"801eb0fd-312d-4913-8608-52baf1c65fea","Type":"ContainerDied","Data":"c2b932493ed3f0afef535eeb360515088298028b54871ce7138ae809f66705c8"} Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.541357 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b932493ed3f0afef535eeb360515088298028b54871ce7138ae809f66705c8" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.541437 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.707653 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2"] Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708501 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801eb0fd-312d-4913-8608-52baf1c65fea" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708535 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="801eb0fd-312d-4913-8608-52baf1c65fea" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708574 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="extract-utilities" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708589 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="extract-utilities" Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708622 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="extract-utilities" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708635 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="extract-utilities" Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708655 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="extract-content" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708669 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="extract-content" Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708692 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="extract-content" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708705 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="extract-content" Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708734 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="registry-server" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708747 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="registry-server" Dec 16 15:22:38 crc kubenswrapper[4728]: E1216 15:22:38.708774 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="registry-server" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.708788 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="registry-server" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.709272 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc49956-67ef-44f4-aaa0-5fd7560fd5df" containerName="registry-server" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.709325 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec595b1-e989-418f-9f33-5ad1d1112e7c" containerName="registry-server" Dec 16 15:22:38 crc 
kubenswrapper[4728]: I1216 15:22:38.709352 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="801eb0fd-312d-4913-8608-52baf1c65fea" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.710433 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.713013 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.713671 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.713889 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.714099 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.726956 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2"] Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.875466 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfldc\" (UniqueName: \"kubernetes.io/projected/26b6262a-41a3-48c4-aba9-a54801be0a7c-kube-api-access-rfldc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.875931 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.876005 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.978728 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.978891 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.979080 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfldc\" (UniqueName: \"kubernetes.io/projected/26b6262a-41a3-48c4-aba9-a54801be0a7c-kube-api-access-rfldc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.985811 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:38 crc kubenswrapper[4728]: I1216 15:22:38.986087 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:39 crc kubenswrapper[4728]: I1216 15:22:39.020464 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfldc\" (UniqueName: \"kubernetes.io/projected/26b6262a-41a3-48c4-aba9-a54801be0a7c-kube-api-access-rfldc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwww2\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:39 crc kubenswrapper[4728]: I1216 15:22:39.045618 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:22:39 crc kubenswrapper[4728]: I1216 15:22:39.652843 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2"] Dec 16 15:22:40 crc kubenswrapper[4728]: I1216 15:22:40.580318 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" event={"ID":"26b6262a-41a3-48c4-aba9-a54801be0a7c","Type":"ContainerStarted","Data":"ad638f90dc2feac31780ce8298bb6527593b59a6c36e4115ad1c58e9f691a06e"} Dec 16 15:22:42 crc kubenswrapper[4728]: I1216 15:22:42.603687 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" event={"ID":"26b6262a-41a3-48c4-aba9-a54801be0a7c","Type":"ContainerStarted","Data":"8f0fe35fae588f7629458e16691c089df77f121e68570a82699fe77a99d4c861"} Dec 16 15:22:42 crc kubenswrapper[4728]: I1216 15:22:42.628329 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" podStartSLOduration=2.806646973 podStartE2EDuration="4.628298719s" podCreationTimestamp="2025-12-16 15:22:38 +0000 UTC" firstStartedPulling="2025-12-16 15:22:39.667063726 +0000 UTC m=+1540.507242720" lastFinishedPulling="2025-12-16 15:22:41.488715482 +0000 UTC m=+1542.328894466" observedRunningTime="2025-12-16 15:22:42.626254154 +0000 UTC m=+1543.466433208" watchObservedRunningTime="2025-12-16 15:22:42.628298719 +0000 UTC m=+1543.468477743" Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.064340 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bn62h"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.077617 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-657w4"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.092128 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dc7b-account-create-update-hk9vg"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.103314 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bn62h"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.112801 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dc7b-account-create-update-hk9vg"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.120017 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-657w4"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.127009 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-51a7-account-create-update-l9t68"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.134542 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-51a7-account-create-update-l9t68"] Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.520471 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a013edbf-b6b3-46ab-b13e-a27d3ddab2c4" path="/var/lib/kubelet/pods/a013edbf-b6b3-46ab-b13e-a27d3ddab2c4/volumes" Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.521656 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0214724-b0c1-40f7-b086-6fea171a8500" path="/var/lib/kubelet/pods/e0214724-b0c1-40f7-b086-6fea171a8500/volumes" Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.522770 4728 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e422ae6c-3605-4278-93aa-116a092e1f95" path="/var/lib/kubelet/pods/e422ae6c-3605-4278-93aa-116a092e1f95/volumes" Dec 16 15:23:23 crc kubenswrapper[4728]: I1216 15:23:23.523870 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6444b69-7cc1-4cbd-a266-00a9f064d649" path="/var/lib/kubelet/pods/f6444b69-7cc1-4cbd-a266-00a9f064d649/volumes" Dec 16 15:23:24 crc kubenswrapper[4728]: I1216 15:23:24.031883 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2c3e-account-create-update-smqdm"] Dec 16 15:23:24 crc kubenswrapper[4728]: I1216 15:23:24.039713 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qvsjj"] Dec 16 15:23:24 crc kubenswrapper[4728]: I1216 15:23:24.047748 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2c3e-account-create-update-smqdm"] Dec 16 15:23:24 crc kubenswrapper[4728]: I1216 15:23:24.056093 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qvsjj"] Dec 16 15:23:25 crc kubenswrapper[4728]: I1216 15:23:25.520683 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b0ce26-d151-49ef-af8e-8ca42ffe3944" path="/var/lib/kubelet/pods/23b0ce26-d151-49ef-af8e-8ca42ffe3944/volumes" Dec 16 15:23:25 crc kubenswrapper[4728]: I1216 15:23:25.521377 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923a5238-0877-49bc-8b92-37cab936f43f" path="/var/lib/kubelet/pods/923a5238-0877-49bc-8b92-37cab936f43f/volumes" Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.059356 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-r77w5"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.073161 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-t2ct9"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.084644 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sd955"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.099628 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-85fc-account-create-update-6fm7d"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.104088 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-dce7-account-create-update-56dlj"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.112976 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fb5d-account-create-update-vx5xh"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.121264 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-t2ct9"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.132302 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-r77w5"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.144364 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sd955"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.155298 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fb5d-account-create-update-vx5xh"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.164298 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-dce7-account-create-update-56dlj"] Dec 16 15:23:56 crc kubenswrapper[4728]: I1216 15:23:56.174215 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-85fc-account-create-update-6fm7d"] Dec 16 15:23:57 crc kubenswrapper[4728]: I1216 15:23:57.521372 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a1be12-1801-4429-90e9-120ecaa41788" path="/var/lib/kubelet/pods/02a1be12-1801-4429-90e9-120ecaa41788/volumes" Dec 16 15:23:57 crc kubenswrapper[4728]: I1216 15:23:57.523216 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d4ab17-9896-4998-a391-38740aabe347" path="/var/lib/kubelet/pods/54d4ab17-9896-4998-a391-38740aabe347/volumes" Dec 16 15:23:57 crc kubenswrapper[4728]: I1216 15:23:57.524577 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f" path="/var/lib/kubelet/pods/78ac9e6f-45c0-4736-8452-d7cdc4ee2d8f/volumes" Dec 16 15:23:57 crc kubenswrapper[4728]: I1216 15:23:57.525802 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c5fa45f-6924-4aca-b07f-f7a26af9ae1e" path="/var/lib/kubelet/pods/7c5fa45f-6924-4aca-b07f-f7a26af9ae1e/volumes" Dec 16 15:23:57 crc kubenswrapper[4728]: I1216 15:23:57.528073 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4129cf7-bc79-4961-aed2-ff704fd4c29e" path="/var/lib/kubelet/pods/b4129cf7-bc79-4961-aed2-ff704fd4c29e/volumes" Dec 16 15:23:57 crc kubenswrapper[4728]: I1216 15:23:57.529256 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da29f3ca-e4e7-4f01-9dfa-315a928c25c3" path="/var/lib/kubelet/pods/da29f3ca-e4e7-4f01-9dfa-315a928c25c3/volumes" Dec 16 15:24:03 crc kubenswrapper[4728]: I1216 15:24:03.051366 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2lb8c"] Dec 16 15:24:03 crc kubenswrapper[4728]: I1216 15:24:03.063602 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2lb8c"] Dec 16 15:24:03 crc kubenswrapper[4728]: I1216 15:24:03.523058 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19c009d-8b36-4b96-9995-541e097b4f21" path="/var/lib/kubelet/pods/f19c009d-8b36-4b96-9995-541e097b4f21/volumes" Dec 16 15:24:08 crc kubenswrapper[4728]: I1216 15:24:08.818491 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:24:08 crc kubenswrapper[4728]: I1216 15:24:08.820432 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.199598 4728 scope.go:117] "RemoveContainer" containerID="57cbacc438f5b3f2c03d8d28bd12b246ef6596f07422e1d8d0db89b9d552c8cc" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.246153 4728 scope.go:117] "RemoveContainer" containerID="68382ced612779b6fe24b054e8b1326f83ad78df158f5128fb0b837e36979e90" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.312795 4728 scope.go:117] "RemoveContainer" containerID="7c48a120c584a8485027a015d3c0a95f2354bd4cc842712c235d869ea07a58f5" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.360086 4728 scope.go:117] "RemoveContainer" 
containerID="21f7766c39bdd705e0930000b54ad8e4bd553d9ddfe515e2edacec61d691730f" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.396692 4728 scope.go:117] "RemoveContainer" containerID="95664df227752c69c91d23a20ba4bbb07f3ee4ae02130a06abafc0bfeffa4702" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.434291 4728 scope.go:117] "RemoveContainer" containerID="94a15d80aa5b91748fc1a632d6d94a173c4f29ac38a30b09c7e1bc8e2f3f1e89" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.497775 4728 scope.go:117] "RemoveContainer" containerID="2b79213d8f79d2599b4b96667c88887812fa01ba79424fd6e8768af991ad3dc7" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.551772 4728 scope.go:117] "RemoveContainer" containerID="77c340523d1a2a994992f73c317da8c4d736bf12a1ed83d0bf7c952fcbf52056" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.581341 4728 scope.go:117] "RemoveContainer" containerID="fcb5e84498f3a1fc8efe49f0c84566a9af16398357e3de61decf5a732c4cd70b" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.626453 4728 scope.go:117] "RemoveContainer" containerID="80222ff207508a2f4cc058819bc0f6e3e14fa71a35b3a50c53731cbd8eef99b3" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.667362 4728 scope.go:117] "RemoveContainer" containerID="a66f0e0506808fcea0552504cc4edd82c8d81cb27c928d490cd7f70e6adf5538" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.699582 4728 scope.go:117] "RemoveContainer" containerID="7d312bb8d2de0d5f58e3774229fc478c86b781322562756eb684719c73a34bd7" Dec 16 15:24:16 crc kubenswrapper[4728]: I1216 15:24:16.735891 4728 scope.go:117] "RemoveContainer" containerID="954de75c25c3e8b73e274320be59e5cba4d0b6b2868b9871771c32edc5076599" Dec 16 15:24:20 crc kubenswrapper[4728]: I1216 15:24:20.986972 4728 generic.go:334] "Generic (PLEG): container finished" podID="26b6262a-41a3-48c4-aba9-a54801be0a7c" containerID="8f0fe35fae588f7629458e16691c089df77f121e68570a82699fe77a99d4c861" exitCode=0 Dec 16 15:24:20 crc kubenswrapper[4728]: I1216 15:24:20.987107 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" event={"ID":"26b6262a-41a3-48c4-aba9-a54801be0a7c","Type":"ContainerDied","Data":"8f0fe35fae588f7629458e16691c089df77f121e68570a82699fe77a99d4c861"} Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.508117 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.671229 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-inventory\") pod \"26b6262a-41a3-48c4-aba9-a54801be0a7c\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.671302 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfldc\" (UniqueName: \"kubernetes.io/projected/26b6262a-41a3-48c4-aba9-a54801be0a7c-kube-api-access-rfldc\") pod \"26b6262a-41a3-48c4-aba9-a54801be0a7c\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.671440 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-ssh-key\") pod \"26b6262a-41a3-48c4-aba9-a54801be0a7c\" (UID: \"26b6262a-41a3-48c4-aba9-a54801be0a7c\") " Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.682060 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b6262a-41a3-48c4-aba9-a54801be0a7c-kube-api-access-rfldc" (OuterVolumeSpecName: "kube-api-access-rfldc") pod "26b6262a-41a3-48c4-aba9-a54801be0a7c" (UID: "26b6262a-41a3-48c4-aba9-a54801be0a7c"). InnerVolumeSpecName "kube-api-access-rfldc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.722076 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26b6262a-41a3-48c4-aba9-a54801be0a7c" (UID: "26b6262a-41a3-48c4-aba9-a54801be0a7c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.722899 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-inventory" (OuterVolumeSpecName: "inventory") pod "26b6262a-41a3-48c4-aba9-a54801be0a7c" (UID: "26b6262a-41a3-48c4-aba9-a54801be0a7c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.774088 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.774120 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfldc\" (UniqueName: \"kubernetes.io/projected/26b6262a-41a3-48c4-aba9-a54801be0a7c-kube-api-access-rfldc\") on node \"crc\" DevicePath \"\"" Dec 16 15:24:22 crc kubenswrapper[4728]: I1216 15:24:22.774130 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26b6262a-41a3-48c4-aba9-a54801be0a7c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.012061 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" event={"ID":"26b6262a-41a3-48c4-aba9-a54801be0a7c","Type":"ContainerDied","Data":"ad638f90dc2feac31780ce8298bb6527593b59a6c36e4115ad1c58e9f691a06e"} Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.012464 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad638f90dc2feac31780ce8298bb6527593b59a6c36e4115ad1c58e9f691a06e" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.012171 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwww2" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.115205 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5"] Dec 16 15:24:23 crc kubenswrapper[4728]: E1216 15:24:23.115983 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b6262a-41a3-48c4-aba9-a54801be0a7c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.116007 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b6262a-41a3-48c4-aba9-a54801be0a7c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.116269 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b6262a-41a3-48c4-aba9-a54801be0a7c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.117269 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.119286 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.119909 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.120046 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.121186 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.128227 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5"] Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.183524 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.183582 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.183612 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvs6\" (UniqueName: \"kubernetes.io/projected/81acd27c-46ac-4132-9e15-6858289dbb7b-kube-api-access-xpvs6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.285091 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.285235 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.285313 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvs6\" (UniqueName: \"kubernetes.io/projected/81acd27c-46ac-4132-9e15-6858289dbb7b-kube-api-access-xpvs6\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.292014 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.297585 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.314542 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvs6\" (UniqueName: \"kubernetes.io/projected/81acd27c-46ac-4132-9e15-6858289dbb7b-kube-api-access-xpvs6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:23 crc kubenswrapper[4728]: I1216 15:24:23.433766 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:24:24 crc kubenswrapper[4728]: I1216 15:24:24.021949 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5"] Dec 16 15:24:25 crc kubenswrapper[4728]: I1216 15:24:25.040990 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" event={"ID":"81acd27c-46ac-4132-9e15-6858289dbb7b","Type":"ContainerStarted","Data":"9c9b10623d78748ec540ff23163b5bda0c94780cdd21e02a938889a8e3254ef7"} Dec 16 15:24:27 crc kubenswrapper[4728]: I1216 15:24:27.060634 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" event={"ID":"81acd27c-46ac-4132-9e15-6858289dbb7b","Type":"ContainerStarted","Data":"48ccb55df3d25dc6bcfdac24d15b2ad6dbafef4ed4dbf60d50d57dc0499e0814"} Dec 16 15:24:27 crc kubenswrapper[4728]: I1216 15:24:27.092098 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" podStartSLOduration=2.061481955 podStartE2EDuration="4.092074205s" podCreationTimestamp="2025-12-16 15:24:23 +0000 UTC" firstStartedPulling="2025-12-16 15:24:24.023578537 +0000 UTC m=+1644.863757521" lastFinishedPulling="2025-12-16 15:24:26.054170767 +0000 UTC m=+1646.894349771" observedRunningTime="2025-12-16 15:24:27.083096549 +0000 UTC m=+1647.923275563" watchObservedRunningTime="2025-12-16 15:24:27.092074205 +0000 UTC m=+1647.932253209" Dec 16 15:24:34 crc kubenswrapper[4728]: I1216 15:24:34.070644 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tnpp2"] Dec 16 15:24:34 crc kubenswrapper[4728]: I1216 15:24:34.086244 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-sync-tnpp2"] Dec 16 15:24:35 crc kubenswrapper[4728]: I1216 15:24:35.528814 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316025cd-8999-4601-a3df-4aaf1dad3a83" path="/var/lib/kubelet/pods/316025cd-8999-4601-a3df-4aaf1dad3a83/volumes" Dec 16 15:24:38 crc kubenswrapper[4728]: I1216 15:24:38.819395 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:24:38 crc kubenswrapper[4728]: I1216 15:24:38.819855 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:24:44 crc kubenswrapper[4728]: I1216 15:24:44.069496 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mwdss"] Dec 16 15:24:44 crc kubenswrapper[4728]: I1216 15:24:44.086574 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mwdss"] Dec 16 15:24:45 crc kubenswrapper[4728]: I1216 15:24:45.525302 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332" path="/var/lib/kubelet/pods/f7a8fd42-3ed7-4db4-ac90-ffc17d6b7332/volumes" Dec 16 15:24:47 crc kubenswrapper[4728]: I1216 15:24:47.042273 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gzgrb"] Dec 16 15:24:47 crc kubenswrapper[4728]: I1216 15:24:47.053206 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gzgrb"] Dec 16 15:24:47 crc kubenswrapper[4728]: I1216 15:24:47.530209 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e129cb-0ce5-4289-a50b-2513ab8ba750" path="/var/lib/kubelet/pods/60e129cb-0ce5-4289-a50b-2513ab8ba750/volumes" Dec 16 15:25:08 crc kubenswrapper[4728]: I1216 15:25:08.819007 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:25:08 crc kubenswrapper[4728]: I1216 15:25:08.819663 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:25:08 crc kubenswrapper[4728]: I1216 15:25:08.819703 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:25:08 crc kubenswrapper[4728]: I1216 15:25:08.820456 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 16 15:25:08 crc kubenswrapper[4728]: I1216 15:25:08.820503 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" gracePeriod=600 Dec 16 15:25:08 crc kubenswrapper[4728]: E1216 15:25:08.947998 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:25:09 crc kubenswrapper[4728]: I1216 15:25:09.552200 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" exitCode=0 Dec 16 15:25:09 crc kubenswrapper[4728]: I1216 15:25:09.552313 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6"} Dec 16 15:25:09 crc kubenswrapper[4728]: I1216 15:25:09.552687 4728 scope.go:117] "RemoveContainer" containerID="cf4555b97afbd3b3d2de44b030a2e6b901aec1b1c9811cdabf788a725a3bd7ca" Dec 16 15:25:09 crc kubenswrapper[4728]: I1216 15:25:09.553567 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:25:09 crc kubenswrapper[4728]: E1216 15:25:09.553941 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:25:15 crc kubenswrapper[4728]: I1216 15:25:15.043018 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wcv69"] Dec 16 15:25:15 crc kubenswrapper[4728]: I1216 15:25:15.051578 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wcv69"] Dec 16 15:25:15 crc kubenswrapper[4728]: I1216 15:25:15.060511 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xfxvz"] Dec 16 15:25:15 crc kubenswrapper[4728]: I1216 15:25:15.068619 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xfxvz"] Dec 16 15:25:15 crc kubenswrapper[4728]: I1216 15:25:15.519594 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fe707b-a597-4768-8190-6efb7aea9faa" path="/var/lib/kubelet/pods/04fe707b-a597-4768-8190-6efb7aea9faa/volumes" Dec 16 15:25:15 crc kubenswrapper[4728]: I1216 15:25:15.520525 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8cfd92c-8ec9-4d81-a119-2c35893fba2b" path="/var/lib/kubelet/pods/d8cfd92c-8ec9-4d81-a119-2c35893fba2b/volumes" Dec 16 15:25:16 crc kubenswrapper[4728]: I1216 15:25:16.983047 4728 scope.go:117] 
"RemoveContainer" containerID="c0977b96e79053722108b55a7d914b08ea2818872b7ad4424a80507ef69b89f8" Dec 16 15:25:17 crc kubenswrapper[4728]: I1216 15:25:17.023329 4728 scope.go:117] "RemoveContainer" containerID="f567ac43ddaa2e5b598e1f0c130271c009569705fae00d3d3d8098c1a09fd023" Dec 16 15:25:17 crc kubenswrapper[4728]: I1216 15:25:17.118463 4728 scope.go:117] "RemoveContainer" containerID="330bb68c46623995b5939a30a6e76c4843fed254b676083b2d151a5ad3c2433e" Dec 16 15:25:17 crc kubenswrapper[4728]: I1216 15:25:17.225451 4728 scope.go:117] "RemoveContainer" containerID="29617b51e7b854b65239789ae0e78fedf8a5f8ed2142edf34a344ed2782a1b0b" Dec 16 15:25:17 crc kubenswrapper[4728]: I1216 15:25:17.280572 4728 scope.go:117] "RemoveContainer" containerID="2785f5649eefb93fed84b8482c968f3ccd82dd418dc4a4324c25b8395214e30a" Dec 16 15:25:20 crc kubenswrapper[4728]: I1216 15:25:20.506876 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:25:20 crc kubenswrapper[4728]: E1216 15:25:20.507782 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:25:33 crc kubenswrapper[4728]: I1216 15:25:33.507388 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:25:33 crc kubenswrapper[4728]: E1216 15:25:33.508585 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:25:34 crc kubenswrapper[4728]: I1216 15:25:34.832661 4728 generic.go:334] "Generic (PLEG): container finished" podID="81acd27c-46ac-4132-9e15-6858289dbb7b" containerID="48ccb55df3d25dc6bcfdac24d15b2ad6dbafef4ed4dbf60d50d57dc0499e0814" exitCode=0 Dec 16 15:25:34 crc kubenswrapper[4728]: I1216 15:25:34.832731 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" event={"ID":"81acd27c-46ac-4132-9e15-6858289dbb7b","Type":"ContainerDied","Data":"48ccb55df3d25dc6bcfdac24d15b2ad6dbafef4ed4dbf60d50d57dc0499e0814"} Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.247519 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.405861 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-inventory\") pod \"81acd27c-46ac-4132-9e15-6858289dbb7b\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.406252 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpvs6\" (UniqueName: \"kubernetes.io/projected/81acd27c-46ac-4132-9e15-6858289dbb7b-kube-api-access-xpvs6\") pod \"81acd27c-46ac-4132-9e15-6858289dbb7b\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.406288 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-ssh-key\") pod \"81acd27c-46ac-4132-9e15-6858289dbb7b\" (UID: \"81acd27c-46ac-4132-9e15-6858289dbb7b\") " Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.411115 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81acd27c-46ac-4132-9e15-6858289dbb7b-kube-api-access-xpvs6" (OuterVolumeSpecName: "kube-api-access-xpvs6") pod "81acd27c-46ac-4132-9e15-6858289dbb7b" (UID: "81acd27c-46ac-4132-9e15-6858289dbb7b"). InnerVolumeSpecName "kube-api-access-xpvs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.431968 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-inventory" (OuterVolumeSpecName: "inventory") pod "81acd27c-46ac-4132-9e15-6858289dbb7b" (UID: "81acd27c-46ac-4132-9e15-6858289dbb7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.435292 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81acd27c-46ac-4132-9e15-6858289dbb7b" (UID: "81acd27c-46ac-4132-9e15-6858289dbb7b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.509047 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.509080 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpvs6\" (UniqueName: \"kubernetes.io/projected/81acd27c-46ac-4132-9e15-6858289dbb7b-kube-api-access-xpvs6\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.509090 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81acd27c-46ac-4132-9e15-6858289dbb7b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.852571 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" event={"ID":"81acd27c-46ac-4132-9e15-6858289dbb7b","Type":"ContainerDied","Data":"9c9b10623d78748ec540ff23163b5bda0c94780cdd21e02a938889a8e3254ef7"} Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.852611 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c9b10623d78748ec540ff23163b5bda0c94780cdd21e02a938889a8e3254ef7" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.852666 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.941566 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns"] Dec 16 15:25:36 crc kubenswrapper[4728]: E1216 15:25:36.942469 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81acd27c-46ac-4132-9e15-6858289dbb7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.942562 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="81acd27c-46ac-4132-9e15-6858289dbb7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.942868 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="81acd27c-46ac-4132-9e15-6858289dbb7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.943752 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.946689 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.953180 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.953275 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.953550 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:25:36 crc kubenswrapper[4728]: I1216 15:25:36.956201 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns"] Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.121830 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.121927 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.122036 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mbd9\" (UniqueName: \"kubernetes.io/projected/b642fac2-b01e-4ec8-80dc-3193414e335c-kube-api-access-5mbd9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.224829 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.225031 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.225114 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mbd9\" (UniqueName: \"kubernetes.io/projected/b642fac2-b01e-4ec8-80dc-3193414e335c-kube-api-access-5mbd9\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.232799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.233617 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.254078 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbd9\" (UniqueName: \"kubernetes.io/projected/b642fac2-b01e-4ec8-80dc-3193414e335c-kube-api-access-5mbd9\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:37 crc kubenswrapper[4728]: I1216 15:25:37.273739 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:38 crc kubenswrapper[4728]: I1216 15:25:38.043284 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns"] Dec 16 15:25:38 crc kubenswrapper[4728]: I1216 15:25:38.870718 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" event={"ID":"b642fac2-b01e-4ec8-80dc-3193414e335c","Type":"ContainerStarted","Data":"d2d7cd29b1614078e23160dcae3b644c8674b28abe4b564c3a06fa1c5c0ae370"} Dec 16 15:25:38 crc kubenswrapper[4728]: I1216 15:25:38.871286 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" event={"ID":"b642fac2-b01e-4ec8-80dc-3193414e335c","Type":"ContainerStarted","Data":"41e5940dea214c0cec411874ca34b1875a17d44a4222b9bf6c5990f738cba0f6"} Dec 16 15:25:40 crc kubenswrapper[4728]: I1216 15:25:40.040041 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" podStartSLOduration=3.569930562 podStartE2EDuration="4.040010914s" podCreationTimestamp="2025-12-16 15:25:36 +0000 UTC" firstStartedPulling="2025-12-16 15:25:38.056972466 +0000 UTC m=+1718.897151470" lastFinishedPulling="2025-12-16 15:25:38.527052828 +0000 UTC m=+1719.367231822" observedRunningTime="2025-12-16 15:25:38.892088097 +0000 UTC m=+1719.732267091" watchObservedRunningTime="2025-12-16 15:25:40.040010914 +0000 UTC m=+1720.880189918" Dec 16 15:25:40 crc kubenswrapper[4728]: I1216 15:25:40.045545 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gbntn"] Dec 16 15:25:40 crc kubenswrapper[4728]: I1216 15:25:40.059066 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-gbntn"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.036793 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6ee5-account-create-update-6k7r6"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.052184 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7b52-account-create-update-q2f6l"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.063042 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2e16-account-create-update-stnh7"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.076094 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6ee5-account-create-update-6k7r6"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.086588 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7b52-account-create-update-q2f6l"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.093250 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2e16-account-create-update-stnh7"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.100344 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-km67h"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.109833 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-km67h"] Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.517604 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296e275b-fc1b-4946-a4f2-2d61fac9aff8" path="/var/lib/kubelet/pods/296e275b-fc1b-4946-a4f2-2d61fac9aff8/volumes" Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.518152 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be26ac5-6df6-4245-abaa-07c0e6fcdffd" path="/var/lib/kubelet/pods/2be26ac5-6df6-4245-abaa-07c0e6fcdffd/volumes" Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.518730 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381f4a37-75d0-4da2-a183-875b9bc481aa" path="/var/lib/kubelet/pods/381f4a37-75d0-4da2-a183-875b9bc481aa/volumes" Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.519230 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56aab1b9-1cbf-4647-b025-581f674334d6" path="/var/lib/kubelet/pods/56aab1b9-1cbf-4647-b025-581f674334d6/volumes" Dec 16 15:25:41 crc kubenswrapper[4728]: I1216 15:25:41.520245 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887aeac4-be08-4765-ac4c-1f2ac326d1a3" path="/var/lib/kubelet/pods/887aeac4-be08-4765-ac4c-1f2ac326d1a3/volumes" Dec 16 15:25:42 crc kubenswrapper[4728]: I1216 15:25:42.054837 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-swl7z"] Dec 16 15:25:42 crc kubenswrapper[4728]: I1216 15:25:42.067629 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-swl7z"] Dec 16 15:25:43 crc kubenswrapper[4728]: I1216 15:25:43.522487 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c269bf0a-1104-49e8-8c99-9ce6926f55c2" path="/var/lib/kubelet/pods/c269bf0a-1104-49e8-8c99-9ce6926f55c2/volumes" Dec 16 15:25:43 crc kubenswrapper[4728]: I1216 15:25:43.932348 4728 generic.go:334] "Generic (PLEG): container finished" podID="b642fac2-b01e-4ec8-80dc-3193414e335c" containerID="d2d7cd29b1614078e23160dcae3b644c8674b28abe4b564c3a06fa1c5c0ae370" exitCode=0 Dec 16 15:25:43 crc 
kubenswrapper[4728]: I1216 15:25:43.932451 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" event={"ID":"b642fac2-b01e-4ec8-80dc-3193414e335c","Type":"ContainerDied","Data":"d2d7cd29b1614078e23160dcae3b644c8674b28abe4b564c3a06fa1c5c0ae370"} Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.549676 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.703608 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-inventory\") pod \"b642fac2-b01e-4ec8-80dc-3193414e335c\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.703728 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mbd9\" (UniqueName: \"kubernetes.io/projected/b642fac2-b01e-4ec8-80dc-3193414e335c-kube-api-access-5mbd9\") pod \"b642fac2-b01e-4ec8-80dc-3193414e335c\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.703964 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-ssh-key\") pod \"b642fac2-b01e-4ec8-80dc-3193414e335c\" (UID: \"b642fac2-b01e-4ec8-80dc-3193414e335c\") " Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.718359 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b642fac2-b01e-4ec8-80dc-3193414e335c-kube-api-access-5mbd9" (OuterVolumeSpecName: "kube-api-access-5mbd9") pod "b642fac2-b01e-4ec8-80dc-3193414e335c" (UID: "b642fac2-b01e-4ec8-80dc-3193414e335c"). InnerVolumeSpecName "kube-api-access-5mbd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.747322 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b642fac2-b01e-4ec8-80dc-3193414e335c" (UID: "b642fac2-b01e-4ec8-80dc-3193414e335c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.761331 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-inventory" (OuterVolumeSpecName: "inventory") pod "b642fac2-b01e-4ec8-80dc-3193414e335c" (UID: "b642fac2-b01e-4ec8-80dc-3193414e335c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.805714 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.805746 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b642fac2-b01e-4ec8-80dc-3193414e335c-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.805760 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mbd9\" (UniqueName: \"kubernetes.io/projected/b642fac2-b01e-4ec8-80dc-3193414e335c-kube-api-access-5mbd9\") on node \"crc\" DevicePath \"\"" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.955671 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" event={"ID":"b642fac2-b01e-4ec8-80dc-3193414e335c","Type":"ContainerDied","Data":"41e5940dea214c0cec411874ca34b1875a17d44a4222b9bf6c5990f738cba0f6"} Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.955741 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41e5940dea214c0cec411874ca34b1875a17d44a4222b9bf6c5990f738cba0f6" Dec 16 15:25:45 crc kubenswrapper[4728]: I1216 15:25:45.955747 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.136362 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x"] Dec 16 15:25:46 crc kubenswrapper[4728]: E1216 15:25:46.136757 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b642fac2-b01e-4ec8-80dc-3193414e335c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.136776 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b642fac2-b01e-4ec8-80dc-3193414e335c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.137005 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b642fac2-b01e-4ec8-80dc-3193414e335c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.137583 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.141397 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.141624 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.141761 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.141873 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.143138 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x"] Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.313724 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt87d\" (UniqueName: \"kubernetes.io/projected/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-kube-api-access-xt87d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.313777 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.313871 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.415892 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt87d\" (UniqueName: \"kubernetes.io/projected/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-kube-api-access-xt87d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.415944 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.416005 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: 
\"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.422198 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.422233 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.435019 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt87d\" (UniqueName: \"kubernetes.io/projected/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-kube-api-access-xt87d\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gsn6x\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.457145 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:25:46 crc kubenswrapper[4728]: I1216 15:25:46.992233 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x"] Dec 16 15:25:47 crc kubenswrapper[4728]: I1216 15:25:47.028820 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9d9zb"] Dec 16 15:25:47 crc kubenswrapper[4728]: I1216 15:25:47.037617 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9d9zb"] Dec 16 15:25:47 crc kubenswrapper[4728]: I1216 15:25:47.517456 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82109b1-c2b6-462c-8857-d0d8b243f64a" path="/var/lib/kubelet/pods/f82109b1-c2b6-462c-8857-d0d8b243f64a/volumes" Dec 16 15:25:47 crc kubenswrapper[4728]: I1216 15:25:47.972459 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" event={"ID":"6f876743-6860-4e07-b8ed-d1cfcd92f2a7","Type":"ContainerStarted","Data":"f4205994b3e655a412677e53eb223c7c60cd433d6e1f4c5e7d8bd955640aee1f"} Dec 16 15:25:47 crc kubenswrapper[4728]: I1216 15:25:47.972805 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" event={"ID":"6f876743-6860-4e07-b8ed-d1cfcd92f2a7","Type":"ContainerStarted","Data":"f5dd2f8ae228c51d1fc01acf316acaa2f8fefd0dd85029a09217da43f6e95a4a"} Dec 16 15:25:47 crc kubenswrapper[4728]: I1216 15:25:47.997193 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" podStartSLOduration=1.447843942 podStartE2EDuration="1.997173523s" podCreationTimestamp="2025-12-16 15:25:46 +0000 UTC" firstStartedPulling="2025-12-16 15:25:46.991893397 +0000 UTC m=+1727.832072381" lastFinishedPulling="2025-12-16 15:25:47.541222958 +0000 UTC m=+1728.381401962" observedRunningTime="2025-12-16 
15:25:47.988878526 +0000 UTC m=+1728.829057560" watchObservedRunningTime="2025-12-16 15:25:47.997173523 +0000 UTC m=+1728.837352517" Dec 16 15:25:48 crc kubenswrapper[4728]: I1216 15:25:48.506846 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:25:48 crc kubenswrapper[4728]: E1216 15:25:48.507138 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:26:03 crc kubenswrapper[4728]: I1216 15:26:03.506705 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:26:03 crc kubenswrapper[4728]: E1216 15:26:03.507716 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:26:16 crc kubenswrapper[4728]: I1216 15:26:16.050961 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bh899"] Dec 16 15:26:16 crc kubenswrapper[4728]: I1216 15:26:16.057969 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bh899"] Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.431596 4728 scope.go:117] "RemoveContainer" containerID="5a15a49a8830f47089fe02053586129c89b224a320417de2d07ef1773e3b146f" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.467294 4728 scope.go:117] "RemoveContainer" containerID="b0337e79c24fb11aa755a2a464f61de22b18cda479f725824638dae8ee5c82f7" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.509806 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:26:17 crc kubenswrapper[4728]: E1216 15:26:17.510125 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.510443 4728 scope.go:117] "RemoveContainer" containerID="5f5a9d87a3391e72a4b4578e9e8a1992ea4c29ea1384a7fc382dda449a092783" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.525572 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d185e68-d66c-438b-b4c2-bde356e4313e" path="/var/lib/kubelet/pods/3d185e68-d66c-438b-b4c2-bde356e4313e/volumes" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.584142 4728 scope.go:117] "RemoveContainer" containerID="73759ef8f04f1a70c0514d308b163d73f97a9db2a7bbab324d7b4170ccd16066" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.626011 4728 scope.go:117] "RemoveContainer" 
containerID="a391b84ed8b4e1ddbf4cb68c6096ee3c1ea90f9c71f3f101d22e807bbe259f2c" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.672061 4728 scope.go:117] "RemoveContainer" containerID="8b33f1d15420a129b4b38864e51d2bc248a9beb033e87b07e04a0a535e65877a" Dec 16 15:26:17 crc kubenswrapper[4728]: I1216 15:26:17.706448 4728 scope.go:117] "RemoveContainer" containerID="5e1e9b3b604f67a60129366077364666dd5a725f1f9dafc89251d3e837096e8c" Dec 16 15:26:29 crc kubenswrapper[4728]: I1216 15:26:29.550210 4728 generic.go:334] "Generic (PLEG): container finished" podID="6f876743-6860-4e07-b8ed-d1cfcd92f2a7" containerID="f4205994b3e655a412677e53eb223c7c60cd433d6e1f4c5e7d8bd955640aee1f" exitCode=0 Dec 16 15:26:29 crc kubenswrapper[4728]: I1216 15:26:29.550261 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" event={"ID":"6f876743-6860-4e07-b8ed-d1cfcd92f2a7","Type":"ContainerDied","Data":"f4205994b3e655a412677e53eb223c7c60cd433d6e1f4c5e7d8bd955640aee1f"} Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.017157 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.112001 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-inventory\") pod \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.112056 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt87d\" (UniqueName: \"kubernetes.io/projected/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-kube-api-access-xt87d\") pod \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.112214 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-ssh-key\") pod \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\" (UID: \"6f876743-6860-4e07-b8ed-d1cfcd92f2a7\") " Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.117910 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-kube-api-access-xt87d" (OuterVolumeSpecName: "kube-api-access-xt87d") pod "6f876743-6860-4e07-b8ed-d1cfcd92f2a7" (UID: "6f876743-6860-4e07-b8ed-d1cfcd92f2a7"). InnerVolumeSpecName "kube-api-access-xt87d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.139212 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-inventory" (OuterVolumeSpecName: "inventory") pod "6f876743-6860-4e07-b8ed-d1cfcd92f2a7" (UID: "6f876743-6860-4e07-b8ed-d1cfcd92f2a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.145618 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f876743-6860-4e07-b8ed-d1cfcd92f2a7" (UID: "6f876743-6860-4e07-b8ed-d1cfcd92f2a7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.235460 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.235514 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.235528 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt87d\" (UniqueName: \"kubernetes.io/projected/6f876743-6860-4e07-b8ed-d1cfcd92f2a7-kube-api-access-xt87d\") on node \"crc\" DevicePath \"\"" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.506236 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:26:31 crc kubenswrapper[4728]: E1216 15:26:31.506597 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.570699 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" event={"ID":"6f876743-6860-4e07-b8ed-d1cfcd92f2a7","Type":"ContainerDied","Data":"f5dd2f8ae228c51d1fc01acf316acaa2f8fefd0dd85029a09217da43f6e95a4a"} Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.570757 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5dd2f8ae228c51d1fc01acf316acaa2f8fefd0dd85029a09217da43f6e95a4a" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.570776 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gsn6x" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.672324 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2"] Dec 16 15:26:31 crc kubenswrapper[4728]: E1216 15:26:31.672936 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f876743-6860-4e07-b8ed-d1cfcd92f2a7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.672972 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f876743-6860-4e07-b8ed-d1cfcd92f2a7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.673396 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f876743-6860-4e07-b8ed-d1cfcd92f2a7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.676930 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.679923 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.679994 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.680216 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.680264 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.683505 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2"] Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.746107 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmcvp\" (UniqueName: \"kubernetes.io/projected/342edac6-5fe9-45d6-9d37-2bc1ed959d23-kube-api-access-rmcvp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.746535 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.746797 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.848428 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.848506 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.848597 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmcvp\" (UniqueName: \"kubernetes.io/projected/342edac6-5fe9-45d6-9d37-2bc1ed959d23-kube-api-access-rmcvp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" 
(UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.854967 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.858037 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:31 crc kubenswrapper[4728]: I1216 15:26:31.865439 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmcvp\" (UniqueName: \"kubernetes.io/projected/342edac6-5fe9-45d6-9d37-2bc1ed959d23-kube-api-access-rmcvp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:32 crc kubenswrapper[4728]: I1216 15:26:32.024053 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:26:32 crc kubenswrapper[4728]: I1216 15:26:32.618675 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2"] Dec 16 15:26:33 crc kubenswrapper[4728]: I1216 15:26:33.597542 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" event={"ID":"342edac6-5fe9-45d6-9d37-2bc1ed959d23","Type":"ContainerStarted","Data":"c26cb81caf63a6ae3e0af93199964f200efd01bf09ea060334ffe1af962c40a2"} Dec 16 15:26:33 crc kubenswrapper[4728]: I1216 15:26:33.598029 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" event={"ID":"342edac6-5fe9-45d6-9d37-2bc1ed959d23","Type":"ContainerStarted","Data":"52a47c9cb695fb2ea211fc57103bd8acaab497a1f09b6f74976ff703f6bc2d44"} Dec 16 15:26:33 crc kubenswrapper[4728]: I1216 15:26:33.641752 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" podStartSLOduration=2.090932054 podStartE2EDuration="2.641731885s" podCreationTimestamp="2025-12-16 15:26:31 +0000 UTC" firstStartedPulling="2025-12-16 15:26:32.635633397 +0000 UTC m=+1773.475812391" lastFinishedPulling="2025-12-16 15:26:33.186433208 +0000 UTC m=+1774.026612222" observedRunningTime="2025-12-16 15:26:33.633977194 +0000 UTC m=+1774.474156198" watchObservedRunningTime="2025-12-16 15:26:33.641731885 +0000 UTC m=+1774.481910879" Dec 16 15:26:42 crc kubenswrapper[4728]: I1216 15:26:42.059503 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-q7mks"] Dec 16 15:26:42 crc kubenswrapper[4728]: I1216 15:26:42.075451 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-q7mks"] Dec 16 15:26:43 crc kubenswrapper[4728]: I1216 15:26:43.526021 4728 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23498e4-9faf-4b56-9d5b-89a616514d12" path="/var/lib/kubelet/pods/c23498e4-9faf-4b56-9d5b-89a616514d12/volumes" Dec 16 15:26:44 crc kubenswrapper[4728]: I1216 15:26:44.506581 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:26:44 crc kubenswrapper[4728]: E1216 15:26:44.507394 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:26:51 crc kubenswrapper[4728]: I1216 15:26:51.030388 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fkg4z"] Dec 16 15:26:51 crc kubenswrapper[4728]: I1216 15:26:51.038054 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fkg4z"] Dec 16 15:26:51 crc kubenswrapper[4728]: I1216 15:26:51.519162 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67785338-b264-4d86-b1b5-6ca4248d938f" path="/var/lib/kubelet/pods/67785338-b264-4d86-b1b5-6ca4248d938f/volumes" Dec 16 15:26:59 crc kubenswrapper[4728]: I1216 15:26:59.518084 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:26:59 crc kubenswrapper[4728]: E1216 15:26:59.519157 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:27:14 crc kubenswrapper[4728]: I1216 15:27:14.505986 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:27:14 crc kubenswrapper[4728]: E1216 15:27:14.506799 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:27:17 crc kubenswrapper[4728]: I1216 15:27:17.872696 4728 scope.go:117] "RemoveContainer" containerID="a765450ffe387eb14fc11c7120b09befd31a586d0ce0af163b0ce45c794e0318" Dec 16 15:27:17 crc kubenswrapper[4728]: I1216 15:27:17.932523 4728 scope.go:117] "RemoveContainer" containerID="e14a33014110e81fd38c867f8181b9ccd0a7c3b800cea2ffeb1ed3977faa7a9d" Dec 16 15:27:17 crc kubenswrapper[4728]: I1216 15:27:17.977501 4728 scope.go:117] "RemoveContainer" containerID="9434c9173fb0736f4e65b34da0a27088919248ae8bcafea8cb472449cda9bf4e" Dec 16 15:27:28 crc kubenswrapper[4728]: I1216 15:27:28.046438 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xzs88"] Dec 16 15:27:28 crc kubenswrapper[4728]: I1216 15:27:28.059572 
4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xzs88"] Dec 16 15:27:29 crc kubenswrapper[4728]: I1216 15:27:29.523375 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:27:29 crc kubenswrapper[4728]: E1216 15:27:29.524688 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:27:29 crc kubenswrapper[4728]: I1216 15:27:29.526313 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e115a66a-589a-4762-b4d8-d80146360675" path="/var/lib/kubelet/pods/e115a66a-589a-4762-b4d8-d80146360675/volumes" Dec 16 15:27:32 crc kubenswrapper[4728]: I1216 15:27:32.235525 4728 generic.go:334] "Generic (PLEG): container finished" podID="342edac6-5fe9-45d6-9d37-2bc1ed959d23" containerID="c26cb81caf63a6ae3e0af93199964f200efd01bf09ea060334ffe1af962c40a2" exitCode=0 Dec 16 15:27:32 crc kubenswrapper[4728]: I1216 15:27:32.235614 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" event={"ID":"342edac6-5fe9-45d6-9d37-2bc1ed959d23","Type":"ContainerDied","Data":"c26cb81caf63a6ae3e0af93199964f200efd01bf09ea060334ffe1af962c40a2"} Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.726261 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.856653 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-inventory\") pod \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.856841 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmcvp\" (UniqueName: \"kubernetes.io/projected/342edac6-5fe9-45d6-9d37-2bc1ed959d23-kube-api-access-rmcvp\") pod \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.856908 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-ssh-key\") pod \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\" (UID: \"342edac6-5fe9-45d6-9d37-2bc1ed959d23\") " Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.863309 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342edac6-5fe9-45d6-9d37-2bc1ed959d23-kube-api-access-rmcvp" (OuterVolumeSpecName: "kube-api-access-rmcvp") pod "342edac6-5fe9-45d6-9d37-2bc1ed959d23" (UID: "342edac6-5fe9-45d6-9d37-2bc1ed959d23"). InnerVolumeSpecName "kube-api-access-rmcvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.886353 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "342edac6-5fe9-45d6-9d37-2bc1ed959d23" (UID: "342edac6-5fe9-45d6-9d37-2bc1ed959d23"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.889922 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-inventory" (OuterVolumeSpecName: "inventory") pod "342edac6-5fe9-45d6-9d37-2bc1ed959d23" (UID: "342edac6-5fe9-45d6-9d37-2bc1ed959d23"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.958918 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.959237 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmcvp\" (UniqueName: \"kubernetes.io/projected/342edac6-5fe9-45d6-9d37-2bc1ed959d23-kube-api-access-rmcvp\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:33 crc kubenswrapper[4728]: I1216 15:27:33.959250 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/342edac6-5fe9-45d6-9d37-2bc1ed959d23-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.261339 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" event={"ID":"342edac6-5fe9-45d6-9d37-2bc1ed959d23","Type":"ContainerDied","Data":"52a47c9cb695fb2ea211fc57103bd8acaab497a1f09b6f74976ff703f6bc2d44"} Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.261402 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a47c9cb695fb2ea211fc57103bd8acaab497a1f09b6f74976ff703f6bc2d44" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.261521 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.359302 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9gcfz"] Dec 16 15:27:34 crc kubenswrapper[4728]: E1216 15:27:34.359970 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342edac6-5fe9-45d6-9d37-2bc1ed959d23" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.360006 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="342edac6-5fe9-45d6-9d37-2bc1ed959d23" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.360356 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="342edac6-5fe9-45d6-9d37-2bc1ed959d23" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.361357 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.364613 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.364837 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.365980 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.366367 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.371804 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9gcfz"] Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.468059 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bnt4\" (UniqueName: \"kubernetes.io/projected/a9b27bc6-f730-4cc8-a626-de82d2c022b8-kube-api-access-4bnt4\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.468390 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.468573 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.570130 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.570254 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bnt4\" (UniqueName: \"kubernetes.io/projected/a9b27bc6-f730-4cc8-a626-de82d2c022b8-kube-api-access-4bnt4\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.570363 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc 
kubenswrapper[4728]: I1216 15:27:34.580639 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.584271 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.600163 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bnt4\" (UniqueName: \"kubernetes.io/projected/a9b27bc6-f730-4cc8-a626-de82d2c022b8-kube-api-access-4bnt4\") pod \"ssh-known-hosts-edpm-deployment-9gcfz\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:34 crc kubenswrapper[4728]: I1216 15:27:34.699477 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:35 crc kubenswrapper[4728]: I1216 15:27:35.320188 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9gcfz"] Dec 16 15:27:35 crc kubenswrapper[4728]: I1216 15:27:35.321795 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:27:36 crc kubenswrapper[4728]: I1216 15:27:36.285323 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" event={"ID":"a9b27bc6-f730-4cc8-a626-de82d2c022b8","Type":"ContainerStarted","Data":"7e38a5af7250b12e1a809d56b72ca25b0ed2185d441cf5bd1f4eeb0c641cf481"} Dec 16 15:27:37 crc kubenswrapper[4728]: I1216 15:27:37.297470 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" event={"ID":"a9b27bc6-f730-4cc8-a626-de82d2c022b8","Type":"ContainerStarted","Data":"9da64ad68325699e7268959a1c9a83b09ea93ad8a3bf668ada5a77e13efbc53b"} Dec 16 15:27:37 crc kubenswrapper[4728]: I1216 15:27:37.323227 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" podStartSLOduration=2.169429083 podStartE2EDuration="3.323194631s" podCreationTimestamp="2025-12-16 15:27:34 +0000 UTC" firstStartedPulling="2025-12-16 15:27:35.321527264 +0000 UTC m=+1836.161706258" lastFinishedPulling="2025-12-16 15:27:36.475292782 +0000 UTC m=+1837.315471806" observedRunningTime="2025-12-16 15:27:37.321982449 +0000 UTC m=+1838.162161473" watchObservedRunningTime="2025-12-16 15:27:37.323194631 +0000 UTC m=+1838.163373655" Dec 16 15:27:40 crc kubenswrapper[4728]: I1216 15:27:40.507114 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:27:40 crc kubenswrapper[4728]: E1216 15:27:40.508064 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:27:44 crc kubenswrapper[4728]: I1216 15:27:44.377102 4728 generic.go:334] "Generic (PLEG): container finished" podID="a9b27bc6-f730-4cc8-a626-de82d2c022b8" containerID="9da64ad68325699e7268959a1c9a83b09ea93ad8a3bf668ada5a77e13efbc53b" exitCode=0 Dec 16 15:27:44 crc kubenswrapper[4728]: I1216 15:27:44.377260 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" event={"ID":"a9b27bc6-f730-4cc8-a626-de82d2c022b8","Type":"ContainerDied","Data":"9da64ad68325699e7268959a1c9a83b09ea93ad8a3bf668ada5a77e13efbc53b"} Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.947778 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.955145 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-ssh-key-openstack-edpm-ipam\") pod \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.955356 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-inventory-0\") pod \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.955672 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bnt4\" (UniqueName: \"kubernetes.io/projected/a9b27bc6-f730-4cc8-a626-de82d2c022b8-kube-api-access-4bnt4\") pod \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\" (UID: \"a9b27bc6-f730-4cc8-a626-de82d2c022b8\") " Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.962861 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b27bc6-f730-4cc8-a626-de82d2c022b8-kube-api-access-4bnt4" (OuterVolumeSpecName: "kube-api-access-4bnt4") pod "a9b27bc6-f730-4cc8-a626-de82d2c022b8" (UID: "a9b27bc6-f730-4cc8-a626-de82d2c022b8"). InnerVolumeSpecName "kube-api-access-4bnt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.992953 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a9b27bc6-f730-4cc8-a626-de82d2c022b8" (UID: "a9b27bc6-f730-4cc8-a626-de82d2c022b8"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:45 crc kubenswrapper[4728]: I1216 15:27:45.996724 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a9b27bc6-f730-4cc8-a626-de82d2c022b8" (UID: "a9b27bc6-f730-4cc8-a626-de82d2c022b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.057294 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bnt4\" (UniqueName: \"kubernetes.io/projected/a9b27bc6-f730-4cc8-a626-de82d2c022b8-kube-api-access-4bnt4\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.057335 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.057349 4728 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a9b27bc6-f730-4cc8-a626-de82d2c022b8-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.398080 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" event={"ID":"a9b27bc6-f730-4cc8-a626-de82d2c022b8","Type":"ContainerDied","Data":"7e38a5af7250b12e1a809d56b72ca25b0ed2185d441cf5bd1f4eeb0c641cf481"} Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.398298 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e38a5af7250b12e1a809d56b72ca25b0ed2185d441cf5bd1f4eeb0c641cf481" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.398153 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9gcfz" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.526614 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk"] Dec 16 15:27:46 crc kubenswrapper[4728]: E1216 15:27:46.527285 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b27bc6-f730-4cc8-a626-de82d2c022b8" containerName="ssh-known-hosts-edpm-deployment" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.527318 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b27bc6-f730-4cc8-a626-de82d2c022b8" containerName="ssh-known-hosts-edpm-deployment" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.527724 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b27bc6-f730-4cc8-a626-de82d2c022b8" containerName="ssh-known-hosts-edpm-deployment" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.528910 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.533019 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.533252 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.533263 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.533266 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.547100 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk"] Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.566788 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6dfd\" (UniqueName: \"kubernetes.io/projected/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-kube-api-access-d6dfd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.567136 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.567224 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.668803 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6dfd\" (UniqueName: \"kubernetes.io/projected/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-kube-api-access-d6dfd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.668957 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.669012 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.674723 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.678834 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.705400 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6dfd\" (UniqueName: \"kubernetes.io/projected/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-kube-api-access-d6dfd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qpfkk\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:46 crc kubenswrapper[4728]: I1216 15:27:46.859696 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:47 crc kubenswrapper[4728]: I1216 15:27:47.288170 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk"] Dec 16 15:27:47 crc kubenswrapper[4728]: I1216 15:27:47.409555 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" event={"ID":"6270b5fc-f711-41e7-b66c-1cac1f2f3b43","Type":"ContainerStarted","Data":"34cb6aa9bff4488d5fead19d1065b379fa313b96af8ec8464c2f5f232428c68b"} Dec 16 15:27:48 crc kubenswrapper[4728]: I1216 15:27:48.429899 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" event={"ID":"6270b5fc-f711-41e7-b66c-1cac1f2f3b43","Type":"ContainerStarted","Data":"bb45266c4f3f36aadbcfc01973be6553023ad8ad0e6869c0503da3a5be122b2d"} Dec 16 15:27:48 crc kubenswrapper[4728]: I1216 15:27:48.463601 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" podStartSLOduration=1.725566233 podStartE2EDuration="2.463585076s" podCreationTimestamp="2025-12-16 15:27:46 +0000 UTC" firstStartedPulling="2025-12-16 15:27:47.293596214 +0000 UTC m=+1848.133775218" lastFinishedPulling="2025-12-16 15:27:48.031615047 +0000 UTC m=+1848.871794061" observedRunningTime="2025-12-16 15:27:48.461022705 +0000 UTC m=+1849.301201719" watchObservedRunningTime="2025-12-16 15:27:48.463585076 +0000 UTC m=+1849.303764060" Dec 16 15:27:55 crc kubenswrapper[4728]: I1216 15:27:55.507206 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:27:55 crc kubenswrapper[4728]: E1216 15:27:55.508392 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:27:57 crc kubenswrapper[4728]: I1216 15:27:57.524176 4728 generic.go:334] "Generic (PLEG): container finished" podID="6270b5fc-f711-41e7-b66c-1cac1f2f3b43" containerID="bb45266c4f3f36aadbcfc01973be6553023ad8ad0e6869c0503da3a5be122b2d" exitCode=0 Dec 16 15:27:57 crc kubenswrapper[4728]: I1216 15:27:57.524275 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" event={"ID":"6270b5fc-f711-41e7-b66c-1cac1f2f3b43","Type":"ContainerDied","Data":"bb45266c4f3f36aadbcfc01973be6553023ad8ad0e6869c0503da3a5be122b2d"} Dec 16 15:27:58 crc kubenswrapper[4728]: I1216 15:27:58.963859 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.101621 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-inventory\") pod \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.101711 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-ssh-key\") pod \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.101842 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6dfd\" (UniqueName: \"kubernetes.io/projected/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-kube-api-access-d6dfd\") pod \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\" (UID: \"6270b5fc-f711-41e7-b66c-1cac1f2f3b43\") " Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.108337 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-kube-api-access-d6dfd" (OuterVolumeSpecName: "kube-api-access-d6dfd") pod "6270b5fc-f711-41e7-b66c-1cac1f2f3b43" (UID: "6270b5fc-f711-41e7-b66c-1cac1f2f3b43"). InnerVolumeSpecName "kube-api-access-d6dfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.126918 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-inventory" (OuterVolumeSpecName: "inventory") pod "6270b5fc-f711-41e7-b66c-1cac1f2f3b43" (UID: "6270b5fc-f711-41e7-b66c-1cac1f2f3b43"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.147723 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6270b5fc-f711-41e7-b66c-1cac1f2f3b43" (UID: "6270b5fc-f711-41e7-b66c-1cac1f2f3b43"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.204934 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.204976 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.204990 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6dfd\" (UniqueName: \"kubernetes.io/projected/6270b5fc-f711-41e7-b66c-1cac1f2f3b43-kube-api-access-d6dfd\") on node \"crc\" DevicePath \"\"" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.547796 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" event={"ID":"6270b5fc-f711-41e7-b66c-1cac1f2f3b43","Type":"ContainerDied","Data":"34cb6aa9bff4488d5fead19d1065b379fa313b96af8ec8464c2f5f232428c68b"} Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.547852 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34cb6aa9bff4488d5fead19d1065b379fa313b96af8ec8464c2f5f232428c68b" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.548503 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qpfkk" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.654267 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79"] Dec 16 15:27:59 crc kubenswrapper[4728]: E1216 15:27:59.654927 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6270b5fc-f711-41e7-b66c-1cac1f2f3b43" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.654944 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="6270b5fc-f711-41e7-b66c-1cac1f2f3b43" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.655156 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6270b5fc-f711-41e7-b66c-1cac1f2f3b43" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.655857 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.659297 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.659979 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.660077 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.660154 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.665991 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79"] Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.816163 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.816233 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9dd\" (UniqueName: \"kubernetes.io/projected/8154d34c-28e4-4d89-a271-f1b2fb4daa29-kube-api-access-bz9dd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.816266 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.918018 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.918192 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz9dd\" (UniqueName: \"kubernetes.io/projected/8154d34c-28e4-4d89-a271-f1b2fb4daa29-kube-api-access-bz9dd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.918686 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: 
\"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.923428 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.931100 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.940518 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9dd\" (UniqueName: \"kubernetes.io/projected/8154d34c-28e4-4d89-a271-f1b2fb4daa29-kube-api-access-bz9dd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:27:59 crc kubenswrapper[4728]: I1216 15:27:59.981841 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:28:00 crc kubenswrapper[4728]: I1216 15:28:00.557632 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79"] Dec 16 15:28:01 crc kubenswrapper[4728]: I1216 15:28:01.567929 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" event={"ID":"8154d34c-28e4-4d89-a271-f1b2fb4daa29","Type":"ContainerStarted","Data":"6a8a95761db284f1c411c05d7c1c096a747a4abde345d884030c65dce191b16e"} Dec 16 15:28:03 crc kubenswrapper[4728]: I1216 15:28:03.605158 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" event={"ID":"8154d34c-28e4-4d89-a271-f1b2fb4daa29","Type":"ContainerStarted","Data":"8fb47984f2abee90dd679b02a0c6fe8f23a3a317b09f697ef50b636611fd003c"} Dec 16 15:28:03 crc kubenswrapper[4728]: I1216 15:28:03.638057 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" podStartSLOduration=2.8329775 podStartE2EDuration="4.638035139s" podCreationTimestamp="2025-12-16 15:27:59 +0000 UTC" firstStartedPulling="2025-12-16 15:28:00.571777542 +0000 UTC m=+1861.411956566" lastFinishedPulling="2025-12-16 15:28:02.376835181 +0000 UTC m=+1863.217014205" observedRunningTime="2025-12-16 15:28:03.627957933 +0000 UTC m=+1864.468136937" watchObservedRunningTime="2025-12-16 15:28:03.638035139 +0000 UTC m=+1864.478214133" Dec 16 15:28:09 crc kubenswrapper[4728]: I1216 15:28:09.517662 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:28:09 crc kubenswrapper[4728]: E1216 15:28:09.519018 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:28:12 crc kubenswrapper[4728]: I1216 15:28:12.708024 4728 generic.go:334] "Generic (PLEG): container finished" podID="8154d34c-28e4-4d89-a271-f1b2fb4daa29" containerID="8fb47984f2abee90dd679b02a0c6fe8f23a3a317b09f697ef50b636611fd003c" exitCode=0 Dec 16 15:28:12 crc kubenswrapper[4728]: I1216 15:28:12.708155 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" event={"ID":"8154d34c-28e4-4d89-a271-f1b2fb4daa29","Type":"ContainerDied","Data":"8fb47984f2abee90dd679b02a0c6fe8f23a3a317b09f697ef50b636611fd003c"} Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.126401 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.226862 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz9dd\" (UniqueName: \"kubernetes.io/projected/8154d34c-28e4-4d89-a271-f1b2fb4daa29-kube-api-access-bz9dd\") pod \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.226988 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-ssh-key\") pod \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.227069 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-inventory\") pod \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\" (UID: \"8154d34c-28e4-4d89-a271-f1b2fb4daa29\") " Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.234570 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8154d34c-28e4-4d89-a271-f1b2fb4daa29-kube-api-access-bz9dd" (OuterVolumeSpecName: "kube-api-access-bz9dd") pod "8154d34c-28e4-4d89-a271-f1b2fb4daa29" (UID: "8154d34c-28e4-4d89-a271-f1b2fb4daa29"). InnerVolumeSpecName "kube-api-access-bz9dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.256533 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8154d34c-28e4-4d89-a271-f1b2fb4daa29" (UID: "8154d34c-28e4-4d89-a271-f1b2fb4daa29"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.277459 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-inventory" (OuterVolumeSpecName: "inventory") pod "8154d34c-28e4-4d89-a271-f1b2fb4daa29" (UID: "8154d34c-28e4-4d89-a271-f1b2fb4daa29"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.330323 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.330382 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz9dd\" (UniqueName: \"kubernetes.io/projected/8154d34c-28e4-4d89-a271-f1b2fb4daa29-kube-api-access-bz9dd\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.330438 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8154d34c-28e4-4d89-a271-f1b2fb4daa29-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.734973 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" event={"ID":"8154d34c-28e4-4d89-a271-f1b2fb4daa29","Type":"ContainerDied","Data":"6a8a95761db284f1c411c05d7c1c096a747a4abde345d884030c65dce191b16e"} Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.735032 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a8a95761db284f1c411c05d7c1c096a747a4abde345d884030c65dce191b16e" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.735070 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.858161 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5"] Dec 16 15:28:14 crc kubenswrapper[4728]: E1216 15:28:14.858715 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8154d34c-28e4-4d89-a271-f1b2fb4daa29" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.858749 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8154d34c-28e4-4d89-a271-f1b2fb4daa29" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.859037 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8154d34c-28e4-4d89-a271-f1b2fb4daa29" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.859938 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.863183 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.863507 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.863712 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.864720 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.866244 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.867245 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.870301 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.872480 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:28:14 crc kubenswrapper[4728]: I1216 15:28:14.886194 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5"] Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.047928 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.048776 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.049021 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.049180 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4w44\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-kube-api-access-b4w44\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.049385 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.049593 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.049740 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.049916 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.050100 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.050328 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.050559 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 
crc kubenswrapper[4728]: I1216 15:28:15.050772 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.051004 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.051194 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.153772 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.153985 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.154149 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.154551 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.154679 4728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.154871 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.154928 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155042 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155083 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4w44\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-kube-api-access-b4w44\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155139 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155247 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155299 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: 
\"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155354 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.155460 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.161246 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.163057 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.163145 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.163538 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.163553 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.165440 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.166158 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.166859 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.172101 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.172215 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.173331 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.174698 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.192592 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 
crc kubenswrapper[4728]: I1216 15:28:15.203053 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4w44\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-kube-api-access-b4w44\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:15 crc kubenswrapper[4728]: I1216 15:28:15.482484 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:28:16 crc kubenswrapper[4728]: I1216 15:28:16.067001 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5"] Dec 16 15:28:16 crc kubenswrapper[4728]: I1216 15:28:16.762604 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" event={"ID":"7238debe-2d46-40ca-b598-2011d69c375c","Type":"ContainerStarted","Data":"e5a0a8d49e918463a45e2aa8eb49cd62516d732439766f5b7debf1e90880b535"} Dec 16 15:28:17 crc kubenswrapper[4728]: I1216 15:28:17.792498 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" event={"ID":"7238debe-2d46-40ca-b598-2011d69c375c","Type":"ContainerStarted","Data":"6ad24d34df22db40c339e55f02e96da8fb1e7fd7542cf86f0b47e546060dba65"} Dec 16 15:28:17 crc kubenswrapper[4728]: I1216 15:28:17.838503 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" podStartSLOduration=2.979288616 podStartE2EDuration="3.838477725s" podCreationTimestamp="2025-12-16 15:28:14 +0000 UTC" firstStartedPulling="2025-12-16 15:28:16.067399742 +0000 UTC m=+1876.907578726" lastFinishedPulling="2025-12-16 15:28:16.926588821 +0000 UTC m=+1877.766767835" observedRunningTime="2025-12-16 15:28:17.819038447 +0000 UTC m=+1878.659217471" watchObservedRunningTime="2025-12-16 15:28:17.838477725 +0000 UTC m=+1878.678656739" Dec 16 15:28:18 crc kubenswrapper[4728]: I1216 15:28:18.087492 4728 scope.go:117] "RemoveContainer" containerID="0db533ee323b74194ca8a61dc93d345eff8a404954a5bee6967ce494aee23d4c" Dec 16 15:28:22 crc kubenswrapper[4728]: I1216 15:28:22.507687 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:28:22 crc kubenswrapper[4728]: E1216 15:28:22.508795 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:28:37 crc kubenswrapper[4728]: I1216 15:28:37.507247 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:28:37 crc kubenswrapper[4728]: E1216 15:28:37.510871 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.454697 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d25g6"] Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.457392 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.471015 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d25g6"] Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.583226 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-utilities\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.583305 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7fb\" (UniqueName: \"kubernetes.io/projected/15423e36-670e-40e9-be21-d12c2ee702ed-kube-api-access-gj7fb\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.583515 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-catalog-content\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.685613 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7fb\" (UniqueName: \"kubernetes.io/projected/15423e36-670e-40e9-be21-d12c2ee702ed-kube-api-access-gj7fb\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.685764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-catalog-content\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.685878 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-utilities\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.686382 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-utilities\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " 
pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.686484 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-catalog-content\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.704350 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7fb\" (UniqueName: \"kubernetes.io/projected/15423e36-670e-40e9-be21-d12c2ee702ed-kube-api-access-gj7fb\") pod \"redhat-marketplace-d25g6\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:45 crc kubenswrapper[4728]: I1216 15:28:45.783650 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:46 crc kubenswrapper[4728]: I1216 15:28:46.262260 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d25g6"] Dec 16 15:28:47 crc kubenswrapper[4728]: I1216 15:28:47.078233 4728 generic.go:334] "Generic (PLEG): container finished" podID="15423e36-670e-40e9-be21-d12c2ee702ed" containerID="12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c" exitCode=0 Dec 16 15:28:47 crc kubenswrapper[4728]: I1216 15:28:47.078293 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d25g6" event={"ID":"15423e36-670e-40e9-be21-d12c2ee702ed","Type":"ContainerDied","Data":"12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c"} Dec 16 15:28:47 crc kubenswrapper[4728]: I1216 15:28:47.078703 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d25g6" event={"ID":"15423e36-670e-40e9-be21-d12c2ee702ed","Type":"ContainerStarted","Data":"b687709385680a791b64176ea8b509d39006f300a085777c35c18c44dccd0f55"} Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.506960 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:28:48 crc kubenswrapper[4728]: E1216 15:28:48.508766 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.634444 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7m9kx"] Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.636327 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.643738 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7m9kx"] Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.746153 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-catalog-content\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.746219 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl25v\" (UniqueName: \"kubernetes.io/projected/a8f63e05-5efc-4490-a927-968097c9b134-kube-api-access-dl25v\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.746297 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-utilities\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.848304 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-catalog-content\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.848421 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl25v\" (UniqueName: \"kubernetes.io/projected/a8f63e05-5efc-4490-a927-968097c9b134-kube-api-access-dl25v\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.848542 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-utilities\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.849122 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-catalog-content\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.849130 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-utilities\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:48 crc kubenswrapper[4728]: I1216 15:28:48.891680 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dl25v\" (UniqueName: \"kubernetes.io/projected/a8f63e05-5efc-4490-a927-968097c9b134-kube-api-access-dl25v\") pod \"community-operators-7m9kx\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:49 crc kubenswrapper[4728]: I1216 15:28:49.000113 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:49 crc kubenswrapper[4728]: I1216 15:28:49.096645 4728 generic.go:334] "Generic (PLEG): container finished" podID="15423e36-670e-40e9-be21-d12c2ee702ed" containerID="033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c" exitCode=0 Dec 16 15:28:49 crc kubenswrapper[4728]: I1216 15:28:49.096698 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d25g6" event={"ID":"15423e36-670e-40e9-be21-d12c2ee702ed","Type":"ContainerDied","Data":"033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c"} Dec 16 15:28:49 crc kubenswrapper[4728]: I1216 15:28:49.584627 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7m9kx"] Dec 16 15:28:50 crc kubenswrapper[4728]: I1216 15:28:50.105593 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d25g6" event={"ID":"15423e36-670e-40e9-be21-d12c2ee702ed","Type":"ContainerStarted","Data":"9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc"} Dec 16 15:28:50 crc kubenswrapper[4728]: I1216 15:28:50.107354 4728 generic.go:334] "Generic (PLEG): container finished" podID="a8f63e05-5efc-4490-a927-968097c9b134" containerID="d21198e3d5165442e66f5388dd93179b94f021a62ce03db2f6c2ceb91ff55748" exitCode=0 Dec 16 15:28:50 crc kubenswrapper[4728]: I1216 15:28:50.107415 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerDied","Data":"d21198e3d5165442e66f5388dd93179b94f021a62ce03db2f6c2ceb91ff55748"} Dec 16 15:28:50 crc kubenswrapper[4728]: I1216 15:28:50.107443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerStarted","Data":"05f4ec0314fcccb86e5d68f778b36bb6bd015f2d958f3252fea7fff0c6dd9708"} Dec 16 15:28:50 crc kubenswrapper[4728]: I1216 15:28:50.135301 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d25g6" podStartSLOduration=2.637908591 podStartE2EDuration="5.135282351s" podCreationTimestamp="2025-12-16 15:28:45 +0000 UTC" firstStartedPulling="2025-12-16 15:28:47.080349997 +0000 UTC m=+1907.920528981" lastFinishedPulling="2025-12-16 15:28:49.577723757 +0000 UTC m=+1910.417902741" observedRunningTime="2025-12-16 15:28:50.130607553 +0000 UTC m=+1910.970786537" watchObservedRunningTime="2025-12-16 15:28:50.135282351 +0000 UTC m=+1910.975461325" Dec 16 15:28:51 crc kubenswrapper[4728]: I1216 15:28:51.119454 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerStarted","Data":"f2479d93ecbd4c77055b07c919f6879fabccff289b78dc9b50e3c16b7d715d4e"} Dec 16 15:28:52 crc kubenswrapper[4728]: I1216 15:28:52.133333 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="a8f63e05-5efc-4490-a927-968097c9b134" containerID="f2479d93ecbd4c77055b07c919f6879fabccff289b78dc9b50e3c16b7d715d4e" exitCode=0 Dec 16 15:28:52 crc kubenswrapper[4728]: I1216 15:28:52.133460 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerDied","Data":"f2479d93ecbd4c77055b07c919f6879fabccff289b78dc9b50e3c16b7d715d4e"} Dec 16 15:28:53 crc kubenswrapper[4728]: I1216 15:28:53.147048 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerStarted","Data":"ccebf0e20e4f5c9759f985f0c49fa51dfa8520d2312030ca138c1c635933db5f"} Dec 16 15:28:53 crc kubenswrapper[4728]: I1216 15:28:53.181985 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7m9kx" podStartSLOduration=2.405816547 podStartE2EDuration="5.181965568s" podCreationTimestamp="2025-12-16 15:28:48 +0000 UTC" firstStartedPulling="2025-12-16 15:28:50.108878522 +0000 UTC m=+1910.949057506" lastFinishedPulling="2025-12-16 15:28:52.885027533 +0000 UTC m=+1913.725206527" observedRunningTime="2025-12-16 15:28:53.176031277 +0000 UTC m=+1914.016210261" watchObservedRunningTime="2025-12-16 15:28:53.181965568 +0000 UTC m=+1914.022144562" Dec 16 15:28:55 crc kubenswrapper[4728]: I1216 15:28:55.784664 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:55 crc kubenswrapper[4728]: I1216 15:28:55.785089 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:55 crc kubenswrapper[4728]: I1216 15:28:55.853436 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:56 crc kubenswrapper[4728]: I1216 15:28:56.241761 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:57 crc kubenswrapper[4728]: I1216 15:28:57.224496 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d25g6"] Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.198807 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d25g6" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="registry-server" containerID="cri-o://9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc" gracePeriod=2 Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.750789 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.893235 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj7fb\" (UniqueName: \"kubernetes.io/projected/15423e36-670e-40e9-be21-d12c2ee702ed-kube-api-access-gj7fb\") pod \"15423e36-670e-40e9-be21-d12c2ee702ed\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.893427 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-utilities\") pod \"15423e36-670e-40e9-be21-d12c2ee702ed\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.893786 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-catalog-content\") pod \"15423e36-670e-40e9-be21-d12c2ee702ed\" (UID: \"15423e36-670e-40e9-be21-d12c2ee702ed\") " Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.894992 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-utilities" (OuterVolumeSpecName: "utilities") pod "15423e36-670e-40e9-be21-d12c2ee702ed" (UID: "15423e36-670e-40e9-be21-d12c2ee702ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.904659 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15423e36-670e-40e9-be21-d12c2ee702ed-kube-api-access-gj7fb" (OuterVolumeSpecName: "kube-api-access-gj7fb") pod "15423e36-670e-40e9-be21-d12c2ee702ed" (UID: "15423e36-670e-40e9-be21-d12c2ee702ed"). InnerVolumeSpecName "kube-api-access-gj7fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.959150 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15423e36-670e-40e9-be21-d12c2ee702ed" (UID: "15423e36-670e-40e9-be21-d12c2ee702ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.996464 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.996509 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj7fb\" (UniqueName: \"kubernetes.io/projected/15423e36-670e-40e9-be21-d12c2ee702ed-kube-api-access-gj7fb\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:58 crc kubenswrapper[4728]: I1216 15:28:58.996539 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15423e36-670e-40e9-be21-d12c2ee702ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.001300 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.001376 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.055228 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.213582 4728 generic.go:334] "Generic (PLEG): container finished" podID="15423e36-670e-40e9-be21-d12c2ee702ed" containerID="9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc" exitCode=0 Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.213665 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d25g6" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.213669 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d25g6" event={"ID":"15423e36-670e-40e9-be21-d12c2ee702ed","Type":"ContainerDied","Data":"9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc"} Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.213911 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d25g6" event={"ID":"15423e36-670e-40e9-be21-d12c2ee702ed","Type":"ContainerDied","Data":"b687709385680a791b64176ea8b509d39006f300a085777c35c18c44dccd0f55"} Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.213945 4728 scope.go:117] "RemoveContainer" containerID="9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.218260 4728 generic.go:334] "Generic (PLEG): container finished" podID="7238debe-2d46-40ca-b598-2011d69c375c" containerID="6ad24d34df22db40c339e55f02e96da8fb1e7fd7542cf86f0b47e546060dba65" exitCode=0 Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.218343 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" event={"ID":"7238debe-2d46-40ca-b598-2011d69c375c","Type":"ContainerDied","Data":"6ad24d34df22db40c339e55f02e96da8fb1e7fd7542cf86f0b47e546060dba65"} Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.245016 4728 scope.go:117] "RemoveContainer" containerID="033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.283454 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d25g6"] Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.297862 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d25g6"] Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.306020 4728 scope.go:117] "RemoveContainer" containerID="12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.309038 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.341438 4728 scope.go:117] "RemoveContainer" containerID="9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc" Dec 16 15:28:59 crc kubenswrapper[4728]: E1216 15:28:59.342024 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc\": container with ID starting with 9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc not found: ID does not exist" containerID="9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.342073 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc"} err="failed to get container status \"9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc\": rpc error: code = NotFound desc = could not find container \"9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc\": container with ID starting with 
9a6a06dab8efba90e732d106c13ebf500ce232c3b3614a0796f997535022a4fc not found: ID does not exist" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.342103 4728 scope.go:117] "RemoveContainer" containerID="033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c" Dec 16 15:28:59 crc kubenswrapper[4728]: E1216 15:28:59.342747 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c\": container with ID starting with 033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c not found: ID does not exist" containerID="033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.342789 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c"} err="failed to get container status \"033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c\": rpc error: code = NotFound desc = could not find container \"033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c\": container with ID starting with 033fc5914711d27b0e0da16a2775999a28729053c741e5abe1bab5bd98fb6c0c not found: ID does not exist" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.342817 4728 scope.go:117] "RemoveContainer" containerID="12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c" Dec 16 15:28:59 crc kubenswrapper[4728]: E1216 15:28:59.343092 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c\": container with ID starting with 12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c not found: ID does not exist" containerID="12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.343133 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c"} err="failed to get container status \"12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c\": rpc error: code = NotFound desc = could not find container \"12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c\": container with ID starting with 12932000a9b5c9d8755f9725c8c1b31b6da3fbc6ef2831d2c6c9ba048ce75e9c not found: ID does not exist" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.525016 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:28:59 crc kubenswrapper[4728]: I1216 15:28:59.525592 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" path="/var/lib/kubelet/pods/15423e36-670e-40e9-be21-d12c2ee702ed/volumes" Dec 16 15:28:59 crc kubenswrapper[4728]: E1216 15:28:59.525674 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.703914 
4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881089 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881150 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ovn-combined-ca-bundle\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881234 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881259 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-libvirt-combined-ca-bundle\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881298 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-telemetry-combined-ca-bundle\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881323 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.881352 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-neutron-metadata-combined-ca-bundle\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882301 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ssh-key\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882333 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-nova-combined-ca-bundle\") pod 
\"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882354 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882443 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4w44\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-kube-api-access-b4w44\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882530 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-bootstrap-combined-ca-bundle\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882549 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-repo-setup-combined-ca-bundle\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.882566 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-inventory\") pod \"7238debe-2d46-40ca-b598-2011d69c375c\" (UID: \"7238debe-2d46-40ca-b598-2011d69c375c\") " Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.888098 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.888238 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.888449 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.889291 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-kube-api-access-b4w44" (OuterVolumeSpecName: "kube-api-access-b4w44") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "kube-api-access-b4w44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.890260 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.891177 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.891756 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.892087 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.892218 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.892888 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.894005 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.894709 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.925916 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-inventory" (OuterVolumeSpecName: "inventory") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.936138 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7238debe-2d46-40ca-b598-2011d69c375c" (UID: "7238debe-2d46-40ca-b598-2011d69c375c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.985943 4728 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.986200 4728 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.986335 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.986521 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.986656 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.986790 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.986906 4728 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987019 4728 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987144 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987279 4728 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987397 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987587 4728 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7238debe-2d46-40ca-b598-2011d69c375c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987748 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:00 crc kubenswrapper[4728]: I1216 15:29:00.987919 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4w44\" (UniqueName: \"kubernetes.io/projected/7238debe-2d46-40ca-b598-2011d69c375c-kube-api-access-b4w44\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.242588 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" event={"ID":"7238debe-2d46-40ca-b598-2011d69c375c","Type":"ContainerDied","Data":"e5a0a8d49e918463a45e2aa8eb49cd62516d732439766f5b7debf1e90880b535"} Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.242917 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a0a8d49e918463a45e2aa8eb49cd62516d732439766f5b7debf1e90880b535" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.242712 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.414110 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7m9kx"] Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.414450 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7m9kx" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="registry-server" containerID="cri-o://ccebf0e20e4f5c9759f985f0c49fa51dfa8520d2312030ca138c1c635933db5f" gracePeriod=2 Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.446480 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58"] Dec 16 15:29:01 crc kubenswrapper[4728]: E1216 15:29:01.447016 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7238debe-2d46-40ca-b598-2011d69c375c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.447039 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7238debe-2d46-40ca-b598-2011d69c375c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 15:29:01 crc kubenswrapper[4728]: E1216 15:29:01.447062 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="registry-server" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.447071 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="registry-server" Dec 16 15:29:01 crc kubenswrapper[4728]: E1216 15:29:01.447119 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="extract-utilities" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.447128 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="extract-utilities" Dec 16 15:29:01 crc kubenswrapper[4728]: E1216 15:29:01.447141 4728 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="extract-content" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.447146 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="extract-content" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.447336 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="15423e36-670e-40e9-be21-d12c2ee702ed" containerName="registry-server" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.447359 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7238debe-2d46-40ca-b598-2011d69c375c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.448272 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.452962 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.453151 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.453732 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.453769 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.453949 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.468730 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58"] Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.603107 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.603647 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.603823 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.604018 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8mc7r\" (UniqueName: \"kubernetes.io/projected/2767eeb4-bf6d-4381-8277-c6d99cad99a5-kube-api-access-8mc7r\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.604190 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.706129 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mc7r\" (UniqueName: \"kubernetes.io/projected/2767eeb4-bf6d-4381-8277-c6d99cad99a5-kube-api-access-8mc7r\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.706741 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.706876 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.707029 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.707144 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.707778 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.710725 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ssh-key\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.711472 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.712649 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.726302 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mc7r\" (UniqueName: \"kubernetes.io/projected/2767eeb4-bf6d-4381-8277-c6d99cad99a5-kube-api-access-8mc7r\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xxw58\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:01 crc kubenswrapper[4728]: I1216 15:29:01.766918 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.254537 4728 generic.go:334] "Generic (PLEG): container finished" podID="a8f63e05-5efc-4490-a927-968097c9b134" containerID="ccebf0e20e4f5c9759f985f0c49fa51dfa8520d2312030ca138c1c635933db5f" exitCode=0 Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.254680 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerDied","Data":"ccebf0e20e4f5c9759f985f0c49fa51dfa8520d2312030ca138c1c635933db5f"} Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.384850 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.394650 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58"] Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.523650 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-catalog-content\") pod \"a8f63e05-5efc-4490-a927-968097c9b134\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.523775 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl25v\" (UniqueName: \"kubernetes.io/projected/a8f63e05-5efc-4490-a927-968097c9b134-kube-api-access-dl25v\") pod \"a8f63e05-5efc-4490-a927-968097c9b134\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.523812 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-utilities\") pod \"a8f63e05-5efc-4490-a927-968097c9b134\" (UID: \"a8f63e05-5efc-4490-a927-968097c9b134\") " Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.526492 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-utilities" (OuterVolumeSpecName: "utilities") pod "a8f63e05-5efc-4490-a927-968097c9b134" (UID: "a8f63e05-5efc-4490-a927-968097c9b134"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.532107 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f63e05-5efc-4490-a927-968097c9b134-kube-api-access-dl25v" (OuterVolumeSpecName: "kube-api-access-dl25v") pod "a8f63e05-5efc-4490-a927-968097c9b134" (UID: "a8f63e05-5efc-4490-a927-968097c9b134"). InnerVolumeSpecName "kube-api-access-dl25v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.598384 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8f63e05-5efc-4490-a927-968097c9b134" (UID: "a8f63e05-5efc-4490-a927-968097c9b134"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.626910 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.626952 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl25v\" (UniqueName: \"kubernetes.io/projected/a8f63e05-5efc-4490-a927-968097c9b134-kube-api-access-dl25v\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:02 crc kubenswrapper[4728]: I1216 15:29:02.626964 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f63e05-5efc-4490-a927-968097c9b134-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.265935 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7m9kx" event={"ID":"a8f63e05-5efc-4490-a927-968097c9b134","Type":"ContainerDied","Data":"05f4ec0314fcccb86e5d68f778b36bb6bd015f2d958f3252fea7fff0c6dd9708"} Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.266273 4728 scope.go:117] "RemoveContainer" containerID="ccebf0e20e4f5c9759f985f0c49fa51dfa8520d2312030ca138c1c635933db5f" Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.268920 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7m9kx" Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.269656 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" event={"ID":"2767eeb4-bf6d-4381-8277-c6d99cad99a5","Type":"ContainerStarted","Data":"ea6acc16ac30b19451f4c62a94230eb09dd0993d44cb05fec8855dcbf477aaf6"} Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.269714 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" event={"ID":"2767eeb4-bf6d-4381-8277-c6d99cad99a5","Type":"ContainerStarted","Data":"04c505735b21888685c1ffe4dfb3493f0e1ea15f4aa340fe13a38e847a2aab83"} Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.299480 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" podStartSLOduration=1.8506036909999999 podStartE2EDuration="2.299463618s" podCreationTimestamp="2025-12-16 15:29:01 +0000 UTC" firstStartedPulling="2025-12-16 15:29:02.391275679 +0000 UTC m=+1923.231454663" lastFinishedPulling="2025-12-16 15:29:02.840135576 +0000 UTC m=+1923.680314590" observedRunningTime="2025-12-16 15:29:03.294280477 +0000 UTC m=+1924.134459491" watchObservedRunningTime="2025-12-16 15:29:03.299463618 +0000 UTC m=+1924.139642602" Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.300100 4728 scope.go:117] "RemoveContainer" containerID="f2479d93ecbd4c77055b07c919f6879fabccff289b78dc9b50e3c16b7d715d4e" Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.324718 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7m9kx"] Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.337713 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7m9kx"] Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.341712 4728 scope.go:117] "RemoveContainer" 
containerID="d21198e3d5165442e66f5388dd93179b94f021a62ce03db2f6c2ceb91ff55748" Dec 16 15:29:03 crc kubenswrapper[4728]: I1216 15:29:03.516704 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f63e05-5efc-4490-a927-968097c9b134" path="/var/lib/kubelet/pods/a8f63e05-5efc-4490-a927-968097c9b134/volumes" Dec 16 15:29:14 crc kubenswrapper[4728]: I1216 15:29:14.507010 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:29:14 crc kubenswrapper[4728]: E1216 15:29:14.509000 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:29:27 crc kubenswrapper[4728]: I1216 15:29:27.507659 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:29:27 crc kubenswrapper[4728]: E1216 15:29:27.508607 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:29:38 crc kubenswrapper[4728]: I1216 15:29:38.508419 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:29:38 crc kubenswrapper[4728]: E1216 15:29:38.509195 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:29:50 crc kubenswrapper[4728]: I1216 15:29:50.507826 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:29:50 crc kubenswrapper[4728]: E1216 15:29:50.509722 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.139464 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2"] Dec 16 15:30:00 crc kubenswrapper[4728]: E1216 15:30:00.140476 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="extract-utilities" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.140495 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="extract-utilities" Dec 16 15:30:00 crc kubenswrapper[4728]: E1216 15:30:00.140507 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="registry-server" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.140514 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="registry-server" Dec 16 15:30:00 crc kubenswrapper[4728]: E1216 15:30:00.140533 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="extract-content" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.140541 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="extract-content" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.140752 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f63e05-5efc-4490-a927-968097c9b134" containerName="registry-server" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.141464 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.143189 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.145526 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.152536 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2"] Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.315537 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-secret-volume\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.315627 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85kb\" (UniqueName: \"kubernetes.io/projected/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-kube-api-access-s85kb\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.315720 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-config-volume\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.417579 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-secret-volume\") pod \"collect-profiles-29431650-pktj2\" (UID: 
\"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.417662 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85kb\" (UniqueName: \"kubernetes.io/projected/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-kube-api-access-s85kb\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.417702 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-config-volume\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.419120 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-config-volume\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.429448 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-secret-volume\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.446186 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85kb\" (UniqueName: \"kubernetes.io/projected/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-kube-api-access-s85kb\") pod \"collect-profiles-29431650-pktj2\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.469805 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:00 crc kubenswrapper[4728]: I1216 15:30:00.942088 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2"] Dec 16 15:30:01 crc kubenswrapper[4728]: I1216 15:30:01.003027 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" event={"ID":"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc","Type":"ContainerStarted","Data":"ef8e855a93df2d43a457b1bea0c296d258fb7e1350d7bf430d842c89c4a307cf"} Dec 16 15:30:02 crc kubenswrapper[4728]: I1216 15:30:02.013720 4728 generic.go:334] "Generic (PLEG): container finished" podID="f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" containerID="f4e10a9444ee136be2a24bb7ef4865b7ae4b3057e60d1b99129724ce84ce9280" exitCode=0 Dec 16 15:30:02 crc kubenswrapper[4728]: I1216 15:30:02.013804 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" event={"ID":"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc","Type":"ContainerDied","Data":"f4e10a9444ee136be2a24bb7ef4865b7ae4b3057e60d1b99129724ce84ce9280"} Dec 16 15:30:02 crc kubenswrapper[4728]: I1216 15:30:02.506924 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:30:02 crc kubenswrapper[4728]: E1216 15:30:02.507185 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.329121 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.475895 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-secret-volume\") pod \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.476139 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-config-volume\") pod \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.476285 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s85kb\" (UniqueName: \"kubernetes.io/projected/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-kube-api-access-s85kb\") pod \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\" (UID: \"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc\") " Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.477227 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" (UID: "f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.482205 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" (UID: "f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.482687 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-kube-api-access-s85kb" (OuterVolumeSpecName: "kube-api-access-s85kb") pod "f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" (UID: "f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc"). InnerVolumeSpecName "kube-api-access-s85kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.578634 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.578677 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:03 crc kubenswrapper[4728]: I1216 15:30:03.578696 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s85kb\" (UniqueName: \"kubernetes.io/projected/f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc-kube-api-access-s85kb\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:04 crc kubenswrapper[4728]: I1216 15:30:04.030349 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" event={"ID":"f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc","Type":"ContainerDied","Data":"ef8e855a93df2d43a457b1bea0c296d258fb7e1350d7bf430d842c89c4a307cf"} Dec 16 15:30:04 crc kubenswrapper[4728]: I1216 15:30:04.030585 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef8e855a93df2d43a457b1bea0c296d258fb7e1350d7bf430d842c89c4a307cf" Dec 16 15:30:04 crc kubenswrapper[4728]: I1216 15:30:04.030442 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431650-pktj2" Dec 16 15:30:04 crc kubenswrapper[4728]: I1216 15:30:04.397924 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj"] Dec 16 15:30:04 crc kubenswrapper[4728]: I1216 15:30:04.404271 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431605-hwwmj"] Dec 16 15:30:05 crc kubenswrapper[4728]: I1216 15:30:05.521879 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fa8699-bd81-4174-a0c1-6b9a3519ab0d" path="/var/lib/kubelet/pods/f0fa8699-bd81-4174-a0c1-6b9a3519ab0d/volumes" Dec 16 15:30:12 crc kubenswrapper[4728]: I1216 15:30:12.104244 4728 generic.go:334] "Generic (PLEG): container finished" podID="2767eeb4-bf6d-4381-8277-c6d99cad99a5" containerID="ea6acc16ac30b19451f4c62a94230eb09dd0993d44cb05fec8855dcbf477aaf6" exitCode=0 Dec 16 15:30:12 crc kubenswrapper[4728]: I1216 15:30:12.104337 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" event={"ID":"2767eeb4-bf6d-4381-8277-c6d99cad99a5","Type":"ContainerDied","Data":"ea6acc16ac30b19451f4c62a94230eb09dd0993d44cb05fec8855dcbf477aaf6"} Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.568588 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.666936 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-inventory\") pod \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.667515 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovncontroller-config-0\") pod \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.667592 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovn-combined-ca-bundle\") pod \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.667762 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ssh-key\") pod \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.667808 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mc7r\" (UniqueName: \"kubernetes.io/projected/2767eeb4-bf6d-4381-8277-c6d99cad99a5-kube-api-access-8mc7r\") pod \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\" (UID: \"2767eeb4-bf6d-4381-8277-c6d99cad99a5\") " Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.675232 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovn-combined-ca-bundle" 
(OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2767eeb4-bf6d-4381-8277-c6d99cad99a5" (UID: "2767eeb4-bf6d-4381-8277-c6d99cad99a5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.676066 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2767eeb4-bf6d-4381-8277-c6d99cad99a5-kube-api-access-8mc7r" (OuterVolumeSpecName: "kube-api-access-8mc7r") pod "2767eeb4-bf6d-4381-8277-c6d99cad99a5" (UID: "2767eeb4-bf6d-4381-8277-c6d99cad99a5"). InnerVolumeSpecName "kube-api-access-8mc7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.694327 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2767eeb4-bf6d-4381-8277-c6d99cad99a5" (UID: "2767eeb4-bf6d-4381-8277-c6d99cad99a5"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.695623 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-inventory" (OuterVolumeSpecName: "inventory") pod "2767eeb4-bf6d-4381-8277-c6d99cad99a5" (UID: "2767eeb4-bf6d-4381-8277-c6d99cad99a5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.695721 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2767eeb4-bf6d-4381-8277-c6d99cad99a5" (UID: "2767eeb4-bf6d-4381-8277-c6d99cad99a5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.770666 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.770711 4728 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.770722 4728 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.770730 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2767eeb4-bf6d-4381-8277-c6d99cad99a5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:13 crc kubenswrapper[4728]: I1216 15:30:13.770739 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mc7r\" (UniqueName: \"kubernetes.io/projected/2767eeb4-bf6d-4381-8277-c6d99cad99a5-kube-api-access-8mc7r\") on node \"crc\" DevicePath \"\"" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.127268 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" event={"ID":"2767eeb4-bf6d-4381-8277-c6d99cad99a5","Type":"ContainerDied","Data":"04c505735b21888685c1ffe4dfb3493f0e1ea15f4aa340fe13a38e847a2aab83"} Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.127311 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c505735b21888685c1ffe4dfb3493f0e1ea15f4aa340fe13a38e847a2aab83" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.127384 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xxw58" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.224383 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt"] Dec 16 15:30:14 crc kubenswrapper[4728]: E1216 15:30:14.224920 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" containerName="collect-profiles" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.224942 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" containerName="collect-profiles" Dec 16 15:30:14 crc kubenswrapper[4728]: E1216 15:30:14.224979 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2767eeb4-bf6d-4381-8277-c6d99cad99a5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.224988 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2767eeb4-bf6d-4381-8277-c6d99cad99a5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.225220 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2767eeb4-bf6d-4381-8277-c6d99cad99a5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.225255 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="f593b380-fddc-4eeb-9fa2-6c8ed7bf72dc" containerName="collect-profiles" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.226003 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.228323 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.229025 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.229258 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.229309 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.229384 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.231174 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.241023 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt"] Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.383162 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.383245 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.383319 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.383362 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.383550 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmksw\" (UniqueName: \"kubernetes.io/projected/545f4f42-f672-4cd9-8050-296aa0dd57b8-kube-api-access-qmksw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.383722 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.486038 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.486118 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.486170 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.486205 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.486242 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmksw\" (UniqueName: \"kubernetes.io/projected/545f4f42-f672-4cd9-8050-296aa0dd57b8-kube-api-access-qmksw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.486282 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.491946 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.491992 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.492094 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.499531 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.499799 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.507339 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmksw\" (UniqueName: \"kubernetes.io/projected/545f4f42-f672-4cd9-8050-296aa0dd57b8-kube-api-access-qmksw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:14 crc kubenswrapper[4728]: I1216 15:30:14.540527 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:30:15 crc kubenswrapper[4728]: I1216 15:30:15.107136 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt"] Dec 16 15:30:15 crc kubenswrapper[4728]: I1216 15:30:15.138746 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" event={"ID":"545f4f42-f672-4cd9-8050-296aa0dd57b8","Type":"ContainerStarted","Data":"836f4d7a986c12ba5a1e4667b51b1187a69252b80c8f361e05bf74e3a1a3fa7f"} Dec 16 15:30:16 crc kubenswrapper[4728]: I1216 15:30:16.150550 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" event={"ID":"545f4f42-f672-4cd9-8050-296aa0dd57b8","Type":"ContainerStarted","Data":"89d9536fd5afb45a8284031d1fb91c23425b718808669f48d102c7dba95318b6"} Dec 16 15:30:16 crc kubenswrapper[4728]: I1216 15:30:16.184332 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" podStartSLOduration=1.45989579 podStartE2EDuration="2.184312531s" podCreationTimestamp="2025-12-16 15:30:14 +0000 UTC" firstStartedPulling="2025-12-16 15:30:15.111970168 +0000 UTC m=+1995.952149172" lastFinishedPulling="2025-12-16 15:30:15.836386929 +0000 UTC m=+1996.676565913" observedRunningTime="2025-12-16 15:30:16.170218748 +0000 UTC m=+1997.010397772" watchObservedRunningTime="2025-12-16 15:30:16.184312531 +0000 UTC m=+1997.024491525" Dec 16 15:30:17 crc kubenswrapper[4728]: I1216 15:30:17.506318 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:30:18 crc kubenswrapper[4728]: I1216 15:30:18.180476 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"56b7ccdd90eefd83a20f0141379b272c6901cc5c468dfd8cec9353a9a7e13a4b"} Dec 16 15:30:18 crc kubenswrapper[4728]: I1216 15:30:18.222727 4728 scope.go:117] "RemoveContainer" containerID="b2c0f614c2dbcb5a5db082689540734fe0a9834c6a5c241a4648c3a9d439210f" Dec 16 15:31:10 crc kubenswrapper[4728]: E1216 15:31:10.642242 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod545f4f42_f672_4cd9_8050_296aa0dd57b8.slice/crio-conmon-89d9536fd5afb45a8284031d1fb91c23425b718808669f48d102c7dba95318b6.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:31:10 crc kubenswrapper[4728]: I1216 15:31:10.825116 4728 generic.go:334] "Generic (PLEG): container finished" podID="545f4f42-f672-4cd9-8050-296aa0dd57b8" containerID="89d9536fd5afb45a8284031d1fb91c23425b718808669f48d102c7dba95318b6" exitCode=0 Dec 16 15:31:10 crc kubenswrapper[4728]: I1216 15:31:10.825183 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" event={"ID":"545f4f42-f672-4cd9-8050-296aa0dd57b8","Type":"ContainerDied","Data":"89d9536fd5afb45a8284031d1fb91c23425b718808669f48d102c7dba95318b6"} Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.316964 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.513180 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-inventory\") pod \"545f4f42-f672-4cd9-8050-296aa0dd57b8\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.513364 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-metadata-combined-ca-bundle\") pod \"545f4f42-f672-4cd9-8050-296aa0dd57b8\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.513553 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-nova-metadata-neutron-config-0\") pod \"545f4f42-f672-4cd9-8050-296aa0dd57b8\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.513627 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"545f4f42-f672-4cd9-8050-296aa0dd57b8\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.513664 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-ssh-key\") pod \"545f4f42-f672-4cd9-8050-296aa0dd57b8\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.513698 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmksw\" (UniqueName: \"kubernetes.io/projected/545f4f42-f672-4cd9-8050-296aa0dd57b8-kube-api-access-qmksw\") pod \"545f4f42-f672-4cd9-8050-296aa0dd57b8\" (UID: \"545f4f42-f672-4cd9-8050-296aa0dd57b8\") " Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.523887 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "545f4f42-f672-4cd9-8050-296aa0dd57b8" (UID: "545f4f42-f672-4cd9-8050-296aa0dd57b8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.525666 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545f4f42-f672-4cd9-8050-296aa0dd57b8-kube-api-access-qmksw" (OuterVolumeSpecName: "kube-api-access-qmksw") pod "545f4f42-f672-4cd9-8050-296aa0dd57b8" (UID: "545f4f42-f672-4cd9-8050-296aa0dd57b8"). InnerVolumeSpecName "kube-api-access-qmksw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.564454 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "545f4f42-f672-4cd9-8050-296aa0dd57b8" (UID: "545f4f42-f672-4cd9-8050-296aa0dd57b8"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.564592 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-inventory" (OuterVolumeSpecName: "inventory") pod "545f4f42-f672-4cd9-8050-296aa0dd57b8" (UID: "545f4f42-f672-4cd9-8050-296aa0dd57b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.566834 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "545f4f42-f672-4cd9-8050-296aa0dd57b8" (UID: "545f4f42-f672-4cd9-8050-296aa0dd57b8"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.577889 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "545f4f42-f672-4cd9-8050-296aa0dd57b8" (UID: "545f4f42-f672-4cd9-8050-296aa0dd57b8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.617332 4728 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.617372 4728 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.617391 4728 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.617535 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.617556 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmksw\" (UniqueName: \"kubernetes.io/projected/545f4f42-f672-4cd9-8050-296aa0dd57b8-kube-api-access-qmksw\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.617572 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/545f4f42-f672-4cd9-8050-296aa0dd57b8-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.849440 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" event={"ID":"545f4f42-f672-4cd9-8050-296aa0dd57b8","Type":"ContainerDied","Data":"836f4d7a986c12ba5a1e4667b51b1187a69252b80c8f361e05bf74e3a1a3fa7f"} Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.849509 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="836f4d7a986c12ba5a1e4667b51b1187a69252b80c8f361e05bf74e3a1a3fa7f" Dec 16 15:31:12 crc kubenswrapper[4728]: I1216 15:31:12.849550 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.054972 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md"] Dec 16 15:31:13 crc kubenswrapper[4728]: E1216 15:31:13.056130 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545f4f42-f672-4cd9-8050-296aa0dd57b8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.056175 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="545f4f42-f672-4cd9-8050-296aa0dd57b8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.056555 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="545f4f42-f672-4cd9-8050-296aa0dd57b8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.057798 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.061017 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.061166 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.061205 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.061356 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.062870 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.068992 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md"] Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.231104 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.231222 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.231335 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.231483 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjjx\" (UniqueName: \"kubernetes.io/projected/355982cb-601d-4505-926c-8fa80bd4f3b6-kube-api-access-4kjjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.231595 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.334262 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.334450 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.334520 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.334579 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kjjx\" (UniqueName: \"kubernetes.io/projected/355982cb-601d-4505-926c-8fa80bd4f3b6-kube-api-access-4kjjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.334718 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.340610 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.348807 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.349001 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.355783 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.371578 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kjjx\" (UniqueName: \"kubernetes.io/projected/355982cb-601d-4505-926c-8fa80bd4f3b6-kube-api-access-4kjjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-fg5md\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:13 crc kubenswrapper[4728]: I1216 15:31:13.391088 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:31:14 crc kubenswrapper[4728]: I1216 15:31:14.058896 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md"] Dec 16 15:31:14 crc kubenswrapper[4728]: I1216 15:31:14.870592 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" event={"ID":"355982cb-601d-4505-926c-8fa80bd4f3b6","Type":"ContainerStarted","Data":"8a60ecde530e3222a1b72bb05a8ab86bdd71bf8af9b04fa427e6ec4ba331758b"} Dec 16 15:31:15 crc kubenswrapper[4728]: I1216 15:31:15.880828 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" event={"ID":"355982cb-601d-4505-926c-8fa80bd4f3b6","Type":"ContainerStarted","Data":"9ca6584ee3636ed25327a42327762f1e113d8fc974daf8f1fd67af268015a98e"} Dec 16 15:31:15 crc kubenswrapper[4728]: I1216 15:31:15.912474 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" podStartSLOduration=2.331832268 podStartE2EDuration="2.912444128s" podCreationTimestamp="2025-12-16 15:31:13 +0000 UTC" firstStartedPulling="2025-12-16 15:31:14.067078572 +0000 UTC m=+2054.907257566" lastFinishedPulling="2025-12-16 15:31:14.647690432 +0000 UTC m=+2055.487869426" observedRunningTime="2025-12-16 15:31:15.902347094 +0000 UTC m=+2056.742526118" watchObservedRunningTime="2025-12-16 15:31:15.912444128 +0000 UTC m=+2056.752623152" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.051109 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g6hh4"] Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.053396 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.067099 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g6hh4"] Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.129379 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-catalog-content\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.129736 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqd7z\" (UniqueName: \"kubernetes.io/projected/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-kube-api-access-rqd7z\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.129938 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-utilities\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.231185 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqd7z\" (UniqueName: \"kubernetes.io/projected/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-kube-api-access-rqd7z\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.231330 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-utilities\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.231351 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-catalog-content\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.231831 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-catalog-content\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.232259 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-utilities\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.252031 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rqd7z\" (UniqueName: \"kubernetes.io/projected/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-kube-api-access-rqd7z\") pod \"redhat-operators-g6hh4\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.401640 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:03 crc kubenswrapper[4728]: I1216 15:32:03.891205 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g6hh4"] Dec 16 15:32:04 crc kubenswrapper[4728]: I1216 15:32:04.402981 4728 generic.go:334] "Generic (PLEG): container finished" podID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerID="9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d" exitCode=0 Dec 16 15:32:04 crc kubenswrapper[4728]: I1216 15:32:04.403029 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerDied","Data":"9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d"} Dec 16 15:32:04 crc kubenswrapper[4728]: I1216 15:32:04.403277 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerStarted","Data":"df33b5d83e50f52432c0922d6ec3a7407e6b49815bf7147c16e103be3bd6e0aa"} Dec 16 15:32:05 crc kubenswrapper[4728]: I1216 15:32:05.414730 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerStarted","Data":"3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d"} Dec 16 15:32:06 crc kubenswrapper[4728]: I1216 15:32:06.427674 4728 generic.go:334] "Generic (PLEG): container finished" podID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerID="3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d" exitCode=0 Dec 16 15:32:06 crc kubenswrapper[4728]: I1216 15:32:06.427740 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerDied","Data":"3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d"} Dec 16 15:32:07 crc kubenswrapper[4728]: I1216 15:32:07.438111 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerStarted","Data":"2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c"} Dec 16 15:32:07 crc kubenswrapper[4728]: I1216 15:32:07.461752 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g6hh4" podStartSLOduration=1.930813041 podStartE2EDuration="4.461732362s" podCreationTimestamp="2025-12-16 15:32:03 +0000 UTC" firstStartedPulling="2025-12-16 15:32:04.404771564 +0000 UTC m=+2105.244950548" lastFinishedPulling="2025-12-16 15:32:06.935690865 +0000 UTC m=+2107.775869869" observedRunningTime="2025-12-16 15:32:07.455147572 +0000 UTC m=+2108.295326556" watchObservedRunningTime="2025-12-16 15:32:07.461732362 +0000 UTC m=+2108.301911346" Dec 16 15:32:13 crc kubenswrapper[4728]: I1216 15:32:13.402362 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 
15:32:13 crc kubenswrapper[4728]: I1216 15:32:13.402810 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:13 crc kubenswrapper[4728]: I1216 15:32:13.460911 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:13 crc kubenswrapper[4728]: I1216 15:32:13.573779 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:13 crc kubenswrapper[4728]: I1216 15:32:13.717076 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g6hh4"] Dec 16 15:32:15 crc kubenswrapper[4728]: I1216 15:32:15.510296 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g6hh4" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="registry-server" containerID="cri-o://2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c" gracePeriod=2 Dec 16 15:32:15 crc kubenswrapper[4728]: I1216 15:32:15.998458 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.141511 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-utilities\") pod \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.141843 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqd7z\" (UniqueName: \"kubernetes.io/projected/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-kube-api-access-rqd7z\") pod \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.141970 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-catalog-content\") pod \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\" (UID: \"40ae9721-3a10-4dcb-ab87-6d85373fb2bb\") " Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.142537 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-utilities" (OuterVolumeSpecName: "utilities") pod "40ae9721-3a10-4dcb-ab87-6d85373fb2bb" (UID: "40ae9721-3a10-4dcb-ab87-6d85373fb2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.146635 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-kube-api-access-rqd7z" (OuterVolumeSpecName: "kube-api-access-rqd7z") pod "40ae9721-3a10-4dcb-ab87-6d85373fb2bb" (UID: "40ae9721-3a10-4dcb-ab87-6d85373fb2bb"). InnerVolumeSpecName "kube-api-access-rqd7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.244022 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqd7z\" (UniqueName: \"kubernetes.io/projected/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-kube-api-access-rqd7z\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.244092 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.276112 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40ae9721-3a10-4dcb-ab87-6d85373fb2bb" (UID: "40ae9721-3a10-4dcb-ab87-6d85373fb2bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.345891 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40ae9721-3a10-4dcb-ab87-6d85373fb2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.520883 4728 generic.go:334] "Generic (PLEG): container finished" podID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerID="2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c" exitCode=0 Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.520965 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g6hh4" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.520943 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerDied","Data":"2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c"} Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.521069 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g6hh4" event={"ID":"40ae9721-3a10-4dcb-ab87-6d85373fb2bb","Type":"ContainerDied","Data":"df33b5d83e50f52432c0922d6ec3a7407e6b49815bf7147c16e103be3bd6e0aa"} Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.521105 4728 scope.go:117] "RemoveContainer" containerID="2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.549822 4728 scope.go:117] "RemoveContainer" containerID="3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.580557 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g6hh4"] Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.586007 4728 scope.go:117] "RemoveContainer" containerID="9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.593200 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g6hh4"] Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.645993 4728 scope.go:117] "RemoveContainer" containerID="2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c" Dec 16 15:32:16 crc kubenswrapper[4728]: E1216 15:32:16.646643 4728 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c\": container with ID starting with 2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c not found: ID does not exist" containerID="2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.646681 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c"} err="failed to get container status \"2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c\": rpc error: code = NotFound desc = could not find container \"2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c\": container with ID starting with 2bf688a67cde1bb1b6226e59dfd8a95e8b4ebbc00c4970916e7ca995eb14ab7c not found: ID does not exist" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.646708 4728 scope.go:117] "RemoveContainer" containerID="3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d" Dec 16 15:32:16 crc kubenswrapper[4728]: E1216 15:32:16.647105 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d\": container with ID starting with 3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d not found: ID does not exist" containerID="3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.647176 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d"} err="failed to get container status \"3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d\": rpc error: code = NotFound desc = could not find container \"3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d\": container with ID starting with 3b347fca37aee5976c617bd54df62f69bc8b6c6d26a2c91fe1856df40baea57d not found: ID does not exist" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.647219 4728 scope.go:117] "RemoveContainer" containerID="9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d" Dec 16 15:32:16 crc kubenswrapper[4728]: E1216 15:32:16.647712 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d\": container with ID starting with 9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d not found: ID does not exist" containerID="9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d" Dec 16 15:32:16 crc kubenswrapper[4728]: I1216 15:32:16.647750 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d"} err="failed to get container status \"9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d\": rpc error: code = NotFound desc = could not find container \"9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d\": container with ID starting with 9aa496ef305a1dba3eabbd50b3c5a6fcb231920344efb53a14af44c57d92223d not found: ID does not exist" Dec 16 15:32:17 crc kubenswrapper[4728]: I1216 15:32:17.524740 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" path="/var/lib/kubelet/pods/40ae9721-3a10-4dcb-ab87-6d85373fb2bb/volumes" Dec 16 15:32:38 crc kubenswrapper[4728]: I1216 15:32:38.819433 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:32:38 crc kubenswrapper[4728]: I1216 15:32:38.820101 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:33:08 crc kubenswrapper[4728]: I1216 15:33:08.818831 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:33:08 crc kubenswrapper[4728]: I1216 15:33:08.819548 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:33:38 crc kubenswrapper[4728]: I1216 15:33:38.818878 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:33:38 crc kubenswrapper[4728]: I1216 15:33:38.819374 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:33:38 crc kubenswrapper[4728]: I1216 15:33:38.819441 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:33:38 crc kubenswrapper[4728]: I1216 15:33:38.820200 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56b7ccdd90eefd83a20f0141379b272c6901cc5c468dfd8cec9353a9a7e13a4b"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:33:38 crc kubenswrapper[4728]: I1216 15:33:38.820247 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://56b7ccdd90eefd83a20f0141379b272c6901cc5c468dfd8cec9353a9a7e13a4b" gracePeriod=600 Dec 16 15:33:39 crc kubenswrapper[4728]: I1216 15:33:39.734578 4728 generic.go:334] "Generic (PLEG): container finished" 
podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="56b7ccdd90eefd83a20f0141379b272c6901cc5c468dfd8cec9353a9a7e13a4b" exitCode=0 Dec 16 15:33:39 crc kubenswrapper[4728]: I1216 15:33:39.734625 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"56b7ccdd90eefd83a20f0141379b272c6901cc5c468dfd8cec9353a9a7e13a4b"} Dec 16 15:33:39 crc kubenswrapper[4728]: I1216 15:33:39.735752 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5"} Dec 16 15:33:39 crc kubenswrapper[4728]: I1216 15:33:39.735854 4728 scope.go:117] "RemoveContainer" containerID="3a9d2f9a664537f0f17c03e9cc1a3221450068d51ba30e9f4f140014d91806e6" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.359921 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-htnjf"] Dec 16 15:34:19 crc kubenswrapper[4728]: E1216 15:34:19.360859 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="extract-content" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.360872 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="extract-content" Dec 16 15:34:19 crc kubenswrapper[4728]: E1216 15:34:19.360900 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="registry-server" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.360906 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="registry-server" Dec 16 15:34:19 crc kubenswrapper[4728]: E1216 15:34:19.360927 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="extract-utilities" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.360934 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="extract-utilities" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.361086 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ae9721-3a10-4dcb-ab87-6d85373fb2bb" containerName="registry-server" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.362300 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.400958 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-htnjf"] Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.484933 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-utilities\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.485663 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrrp\" (UniqueName: \"kubernetes.io/projected/e3063b2e-3324-46e9-be50-f6e5707d558c-kube-api-access-mwrrp\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.485784 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-catalog-content\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.586697 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrrp\" (UniqueName: \"kubernetes.io/projected/e3063b2e-3324-46e9-be50-f6e5707d558c-kube-api-access-mwrrp\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.587036 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-catalog-content\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.587241 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-utilities\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.587472 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-catalog-content\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.587603 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-utilities\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.615763 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mwrrp\" (UniqueName: \"kubernetes.io/projected/e3063b2e-3324-46e9-be50-f6e5707d558c-kube-api-access-mwrrp\") pod \"certified-operators-htnjf\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:19 crc kubenswrapper[4728]: I1216 15:34:19.682763 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:20 crc kubenswrapper[4728]: I1216 15:34:20.316428 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-htnjf"] Dec 16 15:34:21 crc kubenswrapper[4728]: I1216 15:34:21.199802 4728 generic.go:334] "Generic (PLEG): container finished" podID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerID="6a7b7d7122c42128553f201e94695aa5eafe170ac1106b9ed7e2808b0d1bcf34" exitCode=0 Dec 16 15:34:21 crc kubenswrapper[4728]: I1216 15:34:21.200012 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerDied","Data":"6a7b7d7122c42128553f201e94695aa5eafe170ac1106b9ed7e2808b0d1bcf34"} Dec 16 15:34:21 crc kubenswrapper[4728]: I1216 15:34:21.200128 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerStarted","Data":"de2616cee914d5e4afae0f47c207a5408bdf60fa0304e1e2f4e7c4a1d6c3d510"} Dec 16 15:34:21 crc kubenswrapper[4728]: I1216 15:34:21.205334 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:34:22 crc kubenswrapper[4728]: I1216 15:34:22.211626 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerStarted","Data":"9927ea3fa980d1f69149b6eb6745abf34bf6f87c33898d1cfd3cff78d212d90d"} Dec 16 15:34:23 crc kubenswrapper[4728]: I1216 15:34:23.222089 4728 generic.go:334] "Generic (PLEG): container finished" podID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerID="9927ea3fa980d1f69149b6eb6745abf34bf6f87c33898d1cfd3cff78d212d90d" exitCode=0 Dec 16 15:34:23 crc kubenswrapper[4728]: I1216 15:34:23.222162 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerDied","Data":"9927ea3fa980d1f69149b6eb6745abf34bf6f87c33898d1cfd3cff78d212d90d"} Dec 16 15:34:24 crc kubenswrapper[4728]: I1216 15:34:24.231929 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerStarted","Data":"086991843b4acaa3776fcd8b151e67a68e282185e9477a356c4100f199890d97"} Dec 16 15:34:24 crc kubenswrapper[4728]: I1216 15:34:24.251346 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-htnjf" podStartSLOduration=2.568305249 podStartE2EDuration="5.251328919s" podCreationTimestamp="2025-12-16 15:34:19 +0000 UTC" firstStartedPulling="2025-12-16 15:34:21.202929965 +0000 UTC m=+2242.043108969" lastFinishedPulling="2025-12-16 15:34:23.885953655 +0000 UTC m=+2244.726132639" observedRunningTime="2025-12-16 15:34:24.250957969 +0000 UTC m=+2245.091136973" watchObservedRunningTime="2025-12-16 
15:34:24.251328919 +0000 UTC m=+2245.091507903" Dec 16 15:34:29 crc kubenswrapper[4728]: I1216 15:34:29.683483 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:29 crc kubenswrapper[4728]: I1216 15:34:29.684069 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:29 crc kubenswrapper[4728]: I1216 15:34:29.723876 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:30 crc kubenswrapper[4728]: I1216 15:34:30.359845 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:33 crc kubenswrapper[4728]: I1216 15:34:33.752267 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-htnjf"] Dec 16 15:34:33 crc kubenswrapper[4728]: I1216 15:34:33.752847 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-htnjf" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="registry-server" containerID="cri-o://086991843b4acaa3776fcd8b151e67a68e282185e9477a356c4100f199890d97" gracePeriod=2 Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.347927 4728 generic.go:334] "Generic (PLEG): container finished" podID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerID="086991843b4acaa3776fcd8b151e67a68e282185e9477a356c4100f199890d97" exitCode=0 Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.347977 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerDied","Data":"086991843b4acaa3776fcd8b151e67a68e282185e9477a356c4100f199890d97"} Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.822481 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.918500 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwrrp\" (UniqueName: \"kubernetes.io/projected/e3063b2e-3324-46e9-be50-f6e5707d558c-kube-api-access-mwrrp\") pod \"e3063b2e-3324-46e9-be50-f6e5707d558c\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.918656 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-catalog-content\") pod \"e3063b2e-3324-46e9-be50-f6e5707d558c\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.918823 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-utilities\") pod \"e3063b2e-3324-46e9-be50-f6e5707d558c\" (UID: \"e3063b2e-3324-46e9-be50-f6e5707d558c\") " Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.922492 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-utilities" (OuterVolumeSpecName: "utilities") pod "e3063b2e-3324-46e9-be50-f6e5707d558c" (UID: "e3063b2e-3324-46e9-be50-f6e5707d558c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:34:34 crc kubenswrapper[4728]: I1216 15:34:34.931686 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3063b2e-3324-46e9-be50-f6e5707d558c-kube-api-access-mwrrp" (OuterVolumeSpecName: "kube-api-access-mwrrp") pod "e3063b2e-3324-46e9-be50-f6e5707d558c" (UID: "e3063b2e-3324-46e9-be50-f6e5707d558c"). InnerVolumeSpecName "kube-api-access-mwrrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.001789 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3063b2e-3324-46e9-be50-f6e5707d558c" (UID: "e3063b2e-3324-46e9-be50-f6e5707d558c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.021335 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwrrp\" (UniqueName: \"kubernetes.io/projected/e3063b2e-3324-46e9-be50-f6e5707d558c-kube-api-access-mwrrp\") on node \"crc\" DevicePath \"\"" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.021387 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.021426 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3063b2e-3324-46e9-be50-f6e5707d558c-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.368310 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-htnjf" event={"ID":"e3063b2e-3324-46e9-be50-f6e5707d558c","Type":"ContainerDied","Data":"de2616cee914d5e4afae0f47c207a5408bdf60fa0304e1e2f4e7c4a1d6c3d510"} Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.368380 4728 scope.go:117] "RemoveContainer" containerID="086991843b4acaa3776fcd8b151e67a68e282185e9477a356c4100f199890d97" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.368441 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-htnjf" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.395739 4728 scope.go:117] "RemoveContainer" containerID="9927ea3fa980d1f69149b6eb6745abf34bf6f87c33898d1cfd3cff78d212d90d" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.446259 4728 scope.go:117] "RemoveContainer" containerID="6a7b7d7122c42128553f201e94695aa5eafe170ac1106b9ed7e2808b0d1bcf34" Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.460540 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-htnjf"] Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.471032 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-htnjf"] Dec 16 15:34:35 crc kubenswrapper[4728]: I1216 15:34:35.520301 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" path="/var/lib/kubelet/pods/e3063b2e-3324-46e9-be50-f6e5707d558c/volumes" Dec 16 15:35:33 crc kubenswrapper[4728]: I1216 15:35:33.981223 4728 generic.go:334] "Generic (PLEG): container finished" podID="355982cb-601d-4505-926c-8fa80bd4f3b6" containerID="9ca6584ee3636ed25327a42327762f1e113d8fc974daf8f1fd67af268015a98e" exitCode=0 Dec 16 15:35:33 crc kubenswrapper[4728]: I1216 15:35:33.981317 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" event={"ID":"355982cb-601d-4505-926c-8fa80bd4f3b6","Type":"ContainerDied","Data":"9ca6584ee3636ed25327a42327762f1e113d8fc974daf8f1fd67af268015a98e"} Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.462781 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.623203 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-combined-ca-bundle\") pod \"355982cb-601d-4505-926c-8fa80bd4f3b6\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.623334 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-inventory\") pod \"355982cb-601d-4505-926c-8fa80bd4f3b6\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.623377 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-ssh-key\") pod \"355982cb-601d-4505-926c-8fa80bd4f3b6\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.624249 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-secret-0\") pod \"355982cb-601d-4505-926c-8fa80bd4f3b6\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.624393 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kjjx\" (UniqueName: \"kubernetes.io/projected/355982cb-601d-4505-926c-8fa80bd4f3b6-kube-api-access-4kjjx\") pod 
\"355982cb-601d-4505-926c-8fa80bd4f3b6\" (UID: \"355982cb-601d-4505-926c-8fa80bd4f3b6\") " Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.636777 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "355982cb-601d-4505-926c-8fa80bd4f3b6" (UID: "355982cb-601d-4505-926c-8fa80bd4f3b6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.640092 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355982cb-601d-4505-926c-8fa80bd4f3b6-kube-api-access-4kjjx" (OuterVolumeSpecName: "kube-api-access-4kjjx") pod "355982cb-601d-4505-926c-8fa80bd4f3b6" (UID: "355982cb-601d-4505-926c-8fa80bd4f3b6"). InnerVolumeSpecName "kube-api-access-4kjjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.656465 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-inventory" (OuterVolumeSpecName: "inventory") pod "355982cb-601d-4505-926c-8fa80bd4f3b6" (UID: "355982cb-601d-4505-926c-8fa80bd4f3b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.687896 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "355982cb-601d-4505-926c-8fa80bd4f3b6" (UID: "355982cb-601d-4505-926c-8fa80bd4f3b6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.695924 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "355982cb-601d-4505-926c-8fa80bd4f3b6" (UID: "355982cb-601d-4505-926c-8fa80bd4f3b6"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.729775 4728 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.729812 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.729842 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.729854 4728 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/355982cb-601d-4505-926c-8fa80bd4f3b6-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:35:35 crc kubenswrapper[4728]: I1216 15:35:35.729867 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kjjx\" (UniqueName: \"kubernetes.io/projected/355982cb-601d-4505-926c-8fa80bd4f3b6-kube-api-access-4kjjx\") on node \"crc\" DevicePath \"\"" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.007371 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" event={"ID":"355982cb-601d-4505-926c-8fa80bd4f3b6","Type":"ContainerDied","Data":"8a60ecde530e3222a1b72bb05a8ab86bdd71bf8af9b04fa427e6ec4ba331758b"} Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.007482 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a60ecde530e3222a1b72bb05a8ab86bdd71bf8af9b04fa427e6ec4ba331758b" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.007555 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-fg5md" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133051 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn"] Dec 16 15:35:36 crc kubenswrapper[4728]: E1216 15:35:36.133426 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="extract-content" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133442 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="extract-content" Dec 16 15:35:36 crc kubenswrapper[4728]: E1216 15:35:36.133450 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="extract-utilities" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133457 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="extract-utilities" Dec 16 15:35:36 crc kubenswrapper[4728]: E1216 15:35:36.133468 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355982cb-601d-4505-926c-8fa80bd4f3b6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133477 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="355982cb-601d-4505-926c-8fa80bd4f3b6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 15:35:36 crc kubenswrapper[4728]: E1216 15:35:36.133488 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="registry-server" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133494 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="registry-server" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133672 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="355982cb-601d-4505-926c-8fa80bd4f3b6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.133692 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3063b2e-3324-46e9-be50-f6e5707d558c" containerName="registry-server" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.134261 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.136580 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.136811 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.137033 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.137086 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.137592 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.137667 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.137673 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.160850 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn"] Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.241832 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242175 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pszq\" (UniqueName: \"kubernetes.io/projected/655f0b26-df18-45a2-a9f9-24df853d48ed-kube-api-access-7pszq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242211 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242276 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242335 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242353 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242392 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242455 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.242528 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344334 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344382 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344438 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344475 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344515 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344549 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344564 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pszq\" (UniqueName: \"kubernetes.io/projected/655f0b26-df18-45a2-a9f9-24df853d48ed-kube-api-access-7pszq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344590 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.344623 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.346203 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.349378 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.349479 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.350045 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.354092 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.356665 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.356684 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.356801 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.372355 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pszq\" (UniqueName: \"kubernetes.io/projected/655f0b26-df18-45a2-a9f9-24df853d48ed-kube-api-access-7pszq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-77qmn\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:36 crc kubenswrapper[4728]: I1216 15:35:36.460560 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:35:37 crc kubenswrapper[4728]: I1216 15:35:37.047932 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn"] Dec 16 15:35:38 crc kubenswrapper[4728]: I1216 15:35:38.030490 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" event={"ID":"655f0b26-df18-45a2-a9f9-24df853d48ed","Type":"ContainerStarted","Data":"4dbd810800e89243c9176ed10d047188282e1740be21f433cb9257ed68391f26"} Dec 16 15:35:39 crc kubenswrapper[4728]: I1216 15:35:39.040928 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" event={"ID":"655f0b26-df18-45a2-a9f9-24df853d48ed","Type":"ContainerStarted","Data":"5917fb2c9e0e9456e0802ebcd68ff7dc1892fa41d6757a618a1912b36eab04fd"} Dec 16 15:35:39 crc kubenswrapper[4728]: I1216 15:35:39.066259 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" podStartSLOduration=2.435974266 podStartE2EDuration="3.066231699s" podCreationTimestamp="2025-12-16 15:35:36 +0000 UTC" firstStartedPulling="2025-12-16 15:35:37.084681387 +0000 UTC m=+2317.924860371" lastFinishedPulling="2025-12-16 15:35:37.71493878 +0000 UTC m=+2318.555117804" observedRunningTime="2025-12-16 15:35:39.057236716 +0000 UTC m=+2319.897415740" watchObservedRunningTime="2025-12-16 15:35:39.066231699 +0000 UTC m=+2319.906410723" Dec 16 15:36:08 crc kubenswrapper[4728]: I1216 15:36:08.818520 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:36:08 crc kubenswrapper[4728]: I1216 15:36:08.819091 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:36:38 crc kubenswrapper[4728]: I1216 15:36:38.819086 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:36:38 crc kubenswrapper[4728]: I1216 15:36:38.819905 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:37:08 crc kubenswrapper[4728]: I1216 15:37:08.818277 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:37:08 crc kubenswrapper[4728]: I1216 15:37:08.819194 4728 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:37:08 crc kubenswrapper[4728]: I1216 15:37:08.819278 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:37:08 crc kubenswrapper[4728]: I1216 15:37:08.820901 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:37:08 crc kubenswrapper[4728]: I1216 15:37:08.820991 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" gracePeriod=600 Dec 16 15:37:09 crc kubenswrapper[4728]: E1216 15:37:09.453772 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:37:09 crc kubenswrapper[4728]: I1216 15:37:09.923582 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" exitCode=0 Dec 16 15:37:09 crc kubenswrapper[4728]: I1216 15:37:09.923633 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5"} Dec 16 15:37:09 crc kubenswrapper[4728]: I1216 15:37:09.923943 4728 scope.go:117] "RemoveContainer" containerID="56b7ccdd90eefd83a20f0141379b272c6901cc5c468dfd8cec9353a9a7e13a4b" Dec 16 15:37:09 crc kubenswrapper[4728]: I1216 15:37:09.924770 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:37:09 crc kubenswrapper[4728]: E1216 15:37:09.925144 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:37:23 crc kubenswrapper[4728]: I1216 15:37:23.506198 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:37:23 crc kubenswrapper[4728]: E1216 15:37:23.507001 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:37:35 crc kubenswrapper[4728]: I1216 15:37:35.507225 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:37:35 crc kubenswrapper[4728]: E1216 15:37:35.508846 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:37:49 crc kubenswrapper[4728]: I1216 15:37:49.517124 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:37:49 crc kubenswrapper[4728]: E1216 15:37:49.518123 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:38:02 crc kubenswrapper[4728]: I1216 15:38:02.506723 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:38:02 crc kubenswrapper[4728]: E1216 15:38:02.507931 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:38:14 crc kubenswrapper[4728]: I1216 15:38:14.507458 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:38:14 crc kubenswrapper[4728]: E1216 15:38:14.508654 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:38:28 crc kubenswrapper[4728]: I1216 15:38:28.506727 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:38:28 crc kubenswrapper[4728]: E1216 15:38:28.507957 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:38:39 crc kubenswrapper[4728]: I1216 15:38:39.761937 4728 generic.go:334] "Generic (PLEG): container finished" podID="655f0b26-df18-45a2-a9f9-24df853d48ed" containerID="5917fb2c9e0e9456e0802ebcd68ff7dc1892fa41d6757a618a1912b36eab04fd" exitCode=0 Dec 16 15:38:39 crc kubenswrapper[4728]: I1216 15:38:39.762066 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" event={"ID":"655f0b26-df18-45a2-a9f9-24df853d48ed","Type":"ContainerDied","Data":"5917fb2c9e0e9456e0802ebcd68ff7dc1892fa41d6757a618a1912b36eab04fd"} Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.205667 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.292747 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-combined-ca-bundle\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.292917 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-1\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.292990 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-1\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.293021 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pszq\" (UniqueName: \"kubernetes.io/projected/655f0b26-df18-45a2-a9f9-24df853d48ed-kube-api-access-7pszq\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.293045 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-inventory\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.293074 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-ssh-key\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.293097 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-0\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " 
Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.293117 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-extra-config-0\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.293146 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-0\") pod \"655f0b26-df18-45a2-a9f9-24df853d48ed\" (UID: \"655f0b26-df18-45a2-a9f9-24df853d48ed\") " Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.299824 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.300138 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655f0b26-df18-45a2-a9f9-24df853d48ed-kube-api-access-7pszq" (OuterVolumeSpecName: "kube-api-access-7pszq") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "kube-api-access-7pszq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.322742 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.327920 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.329718 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.330262 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-inventory" (OuterVolumeSpecName: "inventory") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.333121 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.333712 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.338415 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "655f0b26-df18-45a2-a9f9-24df853d48ed" (UID: "655f0b26-df18-45a2-a9f9-24df853d48ed"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397079 4728 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397112 4728 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397125 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pszq\" (UniqueName: \"kubernetes.io/projected/655f0b26-df18-45a2-a9f9-24df853d48ed-kube-api-access-7pszq\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397138 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397151 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397160 4728 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397174 4728 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397185 4728 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.397193 4728 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655f0b26-df18-45a2-a9f9-24df853d48ed-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:38:41 crc kubenswrapper[4728]: E1216 15:38:41.715937 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod655f0b26_df18_45a2_a9f9_24df853d48ed.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod655f0b26_df18_45a2_a9f9_24df853d48ed.slice/crio-4dbd810800e89243c9176ed10d047188282e1740be21f433cb9257ed68391f26\": RecentStats: unable to find data in memory cache]" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.778991 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" event={"ID":"655f0b26-df18-45a2-a9f9-24df853d48ed","Type":"ContainerDied","Data":"4dbd810800e89243c9176ed10d047188282e1740be21f433cb9257ed68391f26"} Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.779066 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbd810800e89243c9176ed10d047188282e1740be21f433cb9257ed68391f26" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.779122 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-77qmn" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.901057 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt"] Dec 16 15:38:41 crc kubenswrapper[4728]: E1216 15:38:41.901750 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655f0b26-df18-45a2-a9f9-24df853d48ed" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.901771 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="655f0b26-df18-45a2-a9f9-24df853d48ed" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.901942 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="655f0b26-df18-45a2-a9f9-24df853d48ed" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.902580 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.904397 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.908175 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.908389 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.908705 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.908847 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-64j2l" Dec 16 15:38:41 crc kubenswrapper[4728]: I1216 15:38:41.913265 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt"] Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.011526 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.011659 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.011727 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.011776 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.011818 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tc2\" (UniqueName: \"kubernetes.io/projected/e3adc58c-a09d-4e32-bd59-10d32f1866ca-kube-api-access-r7tc2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 
15:38:42.011862 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.011889 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113369 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113531 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113621 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tc2\" (UniqueName: \"kubernetes.io/projected/e3adc58c-a09d-4e32-bd59-10d32f1866ca-kube-api-access-r7tc2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113694 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113749 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113869 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: 
\"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.113988 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.118938 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.120593 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.122848 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.123985 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.125991 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.126394 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.151664 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tc2\" (UniqueName: \"kubernetes.io/projected/e3adc58c-a09d-4e32-bd59-10d32f1866ca-kube-api-access-r7tc2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt\" (UID: 
\"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.220526 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.506776 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:38:42 crc kubenswrapper[4728]: E1216 15:38:42.507087 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.745747 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt"] Dec 16 15:38:42 crc kubenswrapper[4728]: I1216 15:38:42.790295 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" event={"ID":"e3adc58c-a09d-4e32-bd59-10d32f1866ca","Type":"ContainerStarted","Data":"f5f740e97966e40f01727eef79f6bac67c2df011ac6ec295ffef29757dff8f49"} Dec 16 15:38:44 crc kubenswrapper[4728]: I1216 15:38:44.816892 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" event={"ID":"e3adc58c-a09d-4e32-bd59-10d32f1866ca","Type":"ContainerStarted","Data":"863cb47387eb11472b7ab119fd47edbb519668867a84b823a7765b7df9346337"} Dec 16 15:38:44 crc kubenswrapper[4728]: I1216 15:38:44.843708 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" podStartSLOduration=2.472056862 podStartE2EDuration="3.843681095s" podCreationTimestamp="2025-12-16 15:38:41 +0000 UTC" firstStartedPulling="2025-12-16 15:38:42.755887679 +0000 UTC m=+2503.596066683" lastFinishedPulling="2025-12-16 15:38:44.127511932 +0000 UTC m=+2504.967690916" observedRunningTime="2025-12-16 15:38:44.83849995 +0000 UTC m=+2505.678678934" watchObservedRunningTime="2025-12-16 15:38:44.843681095 +0000 UTC m=+2505.683860079" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.596298 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p6f8q"] Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.602533 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.649119 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6f8q"] Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.760146 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-catalog-content\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.760207 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-utilities\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.760237 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mv5h\" (UniqueName: \"kubernetes.io/projected/58d28616-3451-49a9-aff3-3415c6e5b5dd-kube-api-access-8mv5h\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.862246 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-catalog-content\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.862561 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-utilities\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.862693 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mv5h\" (UniqueName: \"kubernetes.io/projected/58d28616-3451-49a9-aff3-3415c6e5b5dd-kube-api-access-8mv5h\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.862816 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-catalog-content\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.863101 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-utilities\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.881883 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8mv5h\" (UniqueName: \"kubernetes.io/projected/58d28616-3451-49a9-aff3-3415c6e5b5dd-kube-api-access-8mv5h\") pod \"community-operators-p6f8q\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:49 crc kubenswrapper[4728]: I1216 15:38:49.960493 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:50 crc kubenswrapper[4728]: I1216 15:38:50.496620 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p6f8q"] Dec 16 15:38:50 crc kubenswrapper[4728]: I1216 15:38:50.862874 4728 generic.go:334] "Generic (PLEG): container finished" podID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerID="61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16" exitCode=0 Dec 16 15:38:50 crc kubenswrapper[4728]: I1216 15:38:50.862931 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6f8q" event={"ID":"58d28616-3451-49a9-aff3-3415c6e5b5dd","Type":"ContainerDied","Data":"61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16"} Dec 16 15:38:50 crc kubenswrapper[4728]: I1216 15:38:50.862963 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6f8q" event={"ID":"58d28616-3451-49a9-aff3-3415c6e5b5dd","Type":"ContainerStarted","Data":"7bb82e1c776f1371ed2eb7c88cb64462cc17de58b57e08dd7543c2e321eb4395"} Dec 16 15:38:52 crc kubenswrapper[4728]: I1216 15:38:52.884583 4728 generic.go:334] "Generic (PLEG): container finished" podID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerID="af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf" exitCode=0 Dec 16 15:38:52 crc kubenswrapper[4728]: I1216 15:38:52.884823 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6f8q" event={"ID":"58d28616-3451-49a9-aff3-3415c6e5b5dd","Type":"ContainerDied","Data":"af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf"} Dec 16 15:38:54 crc kubenswrapper[4728]: I1216 15:38:54.915074 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6f8q" event={"ID":"58d28616-3451-49a9-aff3-3415c6e5b5dd","Type":"ContainerStarted","Data":"a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8"} Dec 16 15:38:54 crc kubenswrapper[4728]: I1216 15:38:54.960435 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p6f8q" podStartSLOduration=2.873934845 podStartE2EDuration="5.960391242s" podCreationTimestamp="2025-12-16 15:38:49 +0000 UTC" firstStartedPulling="2025-12-16 15:38:50.864873181 +0000 UTC m=+2511.705052165" lastFinishedPulling="2025-12-16 15:38:53.951329568 +0000 UTC m=+2514.791508562" observedRunningTime="2025-12-16 15:38:54.955098343 +0000 UTC m=+2515.795277337" watchObservedRunningTime="2025-12-16 15:38:54.960391242 +0000 UTC m=+2515.800570246" Dec 16 15:38:56 crc kubenswrapper[4728]: I1216 15:38:56.506176 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:38:56 crc kubenswrapper[4728]: E1216 15:38:56.506875 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:38:59 crc kubenswrapper[4728]: I1216 15:38:59.961227 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:38:59 crc kubenswrapper[4728]: I1216 15:38:59.961600 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:39:00 crc kubenswrapper[4728]: I1216 15:39:00.010086 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:39:00 crc kubenswrapper[4728]: I1216 15:39:00.055820 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:39:00 crc kubenswrapper[4728]: I1216 15:39:00.241475 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6f8q"] Dec 16 15:39:01 crc kubenswrapper[4728]: I1216 15:39:01.999126 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p6f8q" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="registry-server" containerID="cri-o://a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8" gracePeriod=2 Dec 16 15:39:02 crc kubenswrapper[4728]: E1216 15:39:02.209424 4728 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d28616_3451_49a9_aff3_3415c6e5b5dd.slice/crio-a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d28616_3451_49a9_aff3_3415c6e5b5dd.slice/crio-conmon-a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8.scope\": RecentStats: unable to find data in memory cache]" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.451025 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.563127 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mv5h\" (UniqueName: \"kubernetes.io/projected/58d28616-3451-49a9-aff3-3415c6e5b5dd-kube-api-access-8mv5h\") pod \"58d28616-3451-49a9-aff3-3415c6e5b5dd\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.563271 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-utilities\") pod \"58d28616-3451-49a9-aff3-3415c6e5b5dd\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.563317 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-catalog-content\") pod \"58d28616-3451-49a9-aff3-3415c6e5b5dd\" (UID: \"58d28616-3451-49a9-aff3-3415c6e5b5dd\") " Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.564239 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-utilities" (OuterVolumeSpecName: "utilities") pod "58d28616-3451-49a9-aff3-3415c6e5b5dd" (UID: "58d28616-3451-49a9-aff3-3415c6e5b5dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.569570 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d28616-3451-49a9-aff3-3415c6e5b5dd-kube-api-access-8mv5h" (OuterVolumeSpecName: "kube-api-access-8mv5h") pod "58d28616-3451-49a9-aff3-3415c6e5b5dd" (UID: "58d28616-3451-49a9-aff3-3415c6e5b5dd"). InnerVolumeSpecName "kube-api-access-8mv5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.622844 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58d28616-3451-49a9-aff3-3415c6e5b5dd" (UID: "58d28616-3451-49a9-aff3-3415c6e5b5dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.665679 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.665709 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d28616-3451-49a9-aff3-3415c6e5b5dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:39:02 crc kubenswrapper[4728]: I1216 15:39:02.665720 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mv5h\" (UniqueName: \"kubernetes.io/projected/58d28616-3451-49a9-aff3-3415c6e5b5dd-kube-api-access-8mv5h\") on node \"crc\" DevicePath \"\"" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.009908 4728 generic.go:334] "Generic (PLEG): container finished" podID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerID="a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8" exitCode=0 Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.009961 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6f8q" event={"ID":"58d28616-3451-49a9-aff3-3415c6e5b5dd","Type":"ContainerDied","Data":"a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8"} Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.009993 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p6f8q" event={"ID":"58d28616-3451-49a9-aff3-3415c6e5b5dd","Type":"ContainerDied","Data":"7bb82e1c776f1371ed2eb7c88cb64462cc17de58b57e08dd7543c2e321eb4395"} Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.010013 4728 scope.go:117] "RemoveContainer" containerID="a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.010783 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p6f8q" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.029662 4728 scope.go:117] "RemoveContainer" containerID="af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.043680 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p6f8q"] Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.051646 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p6f8q"] Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.073645 4728 scope.go:117] "RemoveContainer" containerID="61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.092763 4728 scope.go:117] "RemoveContainer" containerID="a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8" Dec 16 15:39:03 crc kubenswrapper[4728]: E1216 15:39:03.093279 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8\": container with ID starting with a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8 not found: ID does not exist" containerID="a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.093321 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8"} err="failed to get container status \"a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8\": rpc error: code = NotFound desc = could not find container \"a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8\": container with ID starting with a30119c797a175c14e68dbd67937c058b9d7cb1812b3e515b4c2f5af3aeb35a8 not found: ID does not exist" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.093358 4728 scope.go:117] "RemoveContainer" containerID="af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf" Dec 16 15:39:03 crc kubenswrapper[4728]: E1216 15:39:03.093718 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf\": container with ID starting with af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf not found: ID does not exist" containerID="af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.093752 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf"} err="failed to get container status \"af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf\": rpc error: code = NotFound desc = could not find container \"af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf\": container with ID starting with af0f8ff99eb8348cdda8b747c96a080eecc2ba7a6c53cf06076535c83e764cbf not found: ID does not exist" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.093772 4728 scope.go:117] "RemoveContainer" containerID="61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16" Dec 16 15:39:03 crc kubenswrapper[4728]: E1216 15:39:03.094042 4728 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16\": container with ID starting with 61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16 not found: ID does not exist" containerID="61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.094070 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16"} err="failed to get container status \"61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16\": rpc error: code = NotFound desc = could not find container \"61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16\": container with ID starting with 61b7b726e1aa89708611372ad8b409f6bf2eb5d0ae5a6ab816b2718b676b7b16 not found: ID does not exist" Dec 16 15:39:03 crc kubenswrapper[4728]: I1216 15:39:03.515835 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" path="/var/lib/kubelet/pods/58d28616-3451-49a9-aff3-3415c6e5b5dd/volumes" Dec 16 15:39:11 crc kubenswrapper[4728]: I1216 15:39:11.507115 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:39:11 crc kubenswrapper[4728]: E1216 15:39:11.509571 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:39:23 crc kubenswrapper[4728]: I1216 15:39:23.509476 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:39:23 crc kubenswrapper[4728]: E1216 15:39:23.510175 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:39:35 crc kubenswrapper[4728]: I1216 15:39:35.507199 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:39:35 crc kubenswrapper[4728]: E1216 15:39:35.508593 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:39:47 crc kubenswrapper[4728]: I1216 15:39:47.507542 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:39:47 crc kubenswrapper[4728]: E1216 15:39:47.508467 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:39:58 crc kubenswrapper[4728]: I1216 15:39:58.508801 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:39:58 crc kubenswrapper[4728]: E1216 15:39:58.510099 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:40:11 crc kubenswrapper[4728]: I1216 15:40:11.507144 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:40:11 crc kubenswrapper[4728]: E1216 15:40:11.507954 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:40:23 crc kubenswrapper[4728]: I1216 15:40:23.506875 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:40:23 crc kubenswrapper[4728]: E1216 15:40:23.507772 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:40:37 crc kubenswrapper[4728]: I1216 15:40:37.506606 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:40:37 crc kubenswrapper[4728]: E1216 15:40:37.509713 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:40:52 crc kubenswrapper[4728]: I1216 15:40:52.506817 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:40:52 crc kubenswrapper[4728]: E1216 15:40:52.507662 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:41:07 crc kubenswrapper[4728]: I1216 15:41:07.506113 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:41:07 crc kubenswrapper[4728]: E1216 15:41:07.506891 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:41:20 crc kubenswrapper[4728]: I1216 15:41:20.506477 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:41:20 crc kubenswrapper[4728]: E1216 15:41:20.507429 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:41:31 crc kubenswrapper[4728]: I1216 15:41:31.506733 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:41:31 crc kubenswrapper[4728]: E1216 15:41:31.507838 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:41:46 crc kubenswrapper[4728]: I1216 15:41:46.507313 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:41:46 crc kubenswrapper[4728]: E1216 15:41:46.508189 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:41:58 crc kubenswrapper[4728]: I1216 15:41:58.506950 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:41:58 crc kubenswrapper[4728]: E1216 15:41:58.507705 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" 
podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.069893 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cgn94"] Dec 16 15:42:01 crc kubenswrapper[4728]: E1216 15:42:01.071138 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="extract-content" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.071160 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="extract-content" Dec 16 15:42:01 crc kubenswrapper[4728]: E1216 15:42:01.071192 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="extract-utilities" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.071206 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="extract-utilities" Dec 16 15:42:01 crc kubenswrapper[4728]: E1216 15:42:01.071235 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="registry-server" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.071248 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="registry-server" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.071629 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d28616-3451-49a9-aff3-3415c6e5b5dd" containerName="registry-server" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.074378 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.102906 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgn94"] Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.252528 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-catalog-content\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.252604 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng7j8\" (UniqueName: \"kubernetes.io/projected/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-kube-api-access-ng7j8\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.252686 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-utilities\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.355042 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-utilities\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " 
pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.355196 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-catalog-content\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.355226 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7j8\" (UniqueName: \"kubernetes.io/projected/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-kube-api-access-ng7j8\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.355555 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-utilities\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.356162 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-catalog-content\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.380953 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng7j8\" (UniqueName: \"kubernetes.io/projected/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-kube-api-access-ng7j8\") pod \"redhat-marketplace-cgn94\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.399958 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.769648 4728 generic.go:334] "Generic (PLEG): container finished" podID="e3adc58c-a09d-4e32-bd59-10d32f1866ca" containerID="863cb47387eb11472b7ab119fd47edbb519668867a84b823a7765b7df9346337" exitCode=0 Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.769745 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" event={"ID":"e3adc58c-a09d-4e32-bd59-10d32f1866ca","Type":"ContainerDied","Data":"863cb47387eb11472b7ab119fd47edbb519668867a84b823a7765b7df9346337"} Dec 16 15:42:01 crc kubenswrapper[4728]: I1216 15:42:01.863620 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgn94"] Dec 16 15:42:02 crc kubenswrapper[4728]: I1216 15:42:02.781928 4728 generic.go:334] "Generic (PLEG): container finished" podID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerID="172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e" exitCode=0 Dec 16 15:42:02 crc kubenswrapper[4728]: I1216 15:42:02.782307 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgn94" event={"ID":"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983","Type":"ContainerDied","Data":"172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e"} Dec 16 15:42:02 crc kubenswrapper[4728]: I1216 15:42:02.783314 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgn94" event={"ID":"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983","Type":"ContainerStarted","Data":"67a5cfe900b7a0cc44d25cb8a7b188116f21411ebf9f81f9a9ef062f43b8f6a9"} Dec 16 15:42:02 crc kubenswrapper[4728]: I1216 15:42:02.785910 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.229778 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.391785 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-0\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.391913 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-1\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.391942 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ssh-key\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.391977 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-telemetry-combined-ca-bundle\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.392002 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-inventory\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.392025 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-2\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.392690 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7tc2\" (UniqueName: \"kubernetes.io/projected/e3adc58c-a09d-4e32-bd59-10d32f1866ca-kube-api-access-r7tc2\") pod \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\" (UID: \"e3adc58c-a09d-4e32-bd59-10d32f1866ca\") " Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.398609 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3adc58c-a09d-4e32-bd59-10d32f1866ca-kube-api-access-r7tc2" (OuterVolumeSpecName: "kube-api-access-r7tc2") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). InnerVolumeSpecName "kube-api-access-r7tc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.399386 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.420709 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-inventory" (OuterVolumeSpecName: "inventory") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.422262 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.425856 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.428264 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.430529 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "e3adc58c-a09d-4e32-bd59-10d32f1866ca" (UID: "e3adc58c-a09d-4e32-bd59-10d32f1866ca"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494662 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494708 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494718 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494730 4728 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494742 4728 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-inventory\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494751 4728 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e3adc58c-a09d-4e32-bd59-10d32f1866ca-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.494760 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7tc2\" (UniqueName: \"kubernetes.io/projected/e3adc58c-a09d-4e32-bd59-10d32f1866ca-kube-api-access-r7tc2\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.793562 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" event={"ID":"e3adc58c-a09d-4e32-bd59-10d32f1866ca","Type":"ContainerDied","Data":"f5f740e97966e40f01727eef79f6bac67c2df011ac6ec295ffef29757dff8f49"} Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.793866 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f740e97966e40f01727eef79f6bac67c2df011ac6ec295ffef29757dff8f49" Dec 16 15:42:03 crc kubenswrapper[4728]: I1216 15:42:03.793604 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt" Dec 16 15:42:04 crc kubenswrapper[4728]: I1216 15:42:04.807094 4728 generic.go:334] "Generic (PLEG): container finished" podID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerID="980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f" exitCode=0 Dec 16 15:42:04 crc kubenswrapper[4728]: I1216 15:42:04.807647 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgn94" event={"ID":"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983","Type":"ContainerDied","Data":"980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f"} Dec 16 15:42:06 crc kubenswrapper[4728]: I1216 15:42:06.824452 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgn94" event={"ID":"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983","Type":"ContainerStarted","Data":"c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d"} Dec 16 15:42:06 crc kubenswrapper[4728]: I1216 15:42:06.844720 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cgn94" podStartSLOduration=2.974186376 podStartE2EDuration="5.84470355s" podCreationTimestamp="2025-12-16 15:42:01 +0000 UTC" firstStartedPulling="2025-12-16 15:42:02.785646351 +0000 UTC m=+2703.625825345" lastFinishedPulling="2025-12-16 15:42:05.656163525 +0000 UTC m=+2706.496342519" observedRunningTime="2025-12-16 15:42:06.841476036 +0000 UTC m=+2707.681655020" watchObservedRunningTime="2025-12-16 15:42:06.84470355 +0000 UTC m=+2707.684882534" Dec 16 15:42:11 crc kubenswrapper[4728]: I1216 15:42:11.400368 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:11 crc kubenswrapper[4728]: I1216 15:42:11.400927 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:11 crc kubenswrapper[4728]: I1216 15:42:11.454915 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:11 crc kubenswrapper[4728]: I1216 15:42:11.507208 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:42:11 crc kubenswrapper[4728]: I1216 15:42:11.875585 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"8fc32d05477689ff87ff50e703467d8f097d88b4dace14a4b867ef90c3fad142"} Dec 16 15:42:11 crc kubenswrapper[4728]: I1216 15:42:11.936498 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:12 crc kubenswrapper[4728]: I1216 15:42:12.004690 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgn94"] Dec 16 15:42:13 crc kubenswrapper[4728]: I1216 15:42:13.897130 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cgn94" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="registry-server" containerID="cri-o://c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d" gracePeriod=2 Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.338788 4728 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.449079 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-utilities\") pod \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.449227 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-catalog-content\") pod \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.449522 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng7j8\" (UniqueName: \"kubernetes.io/projected/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-kube-api-access-ng7j8\") pod \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\" (UID: \"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983\") " Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.450023 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-utilities" (OuterVolumeSpecName: "utilities") pod "d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" (UID: "d92dd2fc-9aa5-4d95-89bb-0e589a4fc983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.463286 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-kube-api-access-ng7j8" (OuterVolumeSpecName: "kube-api-access-ng7j8") pod "d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" (UID: "d92dd2fc-9aa5-4d95-89bb-0e589a4fc983"). InnerVolumeSpecName "kube-api-access-ng7j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.483034 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" (UID: "d92dd2fc-9aa5-4d95-89bb-0e589a4fc983"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:42:14 crc kubenswrapper[4728]: E1216 15:42:14.540341 4728 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.210:32966->38.102.83.210:34353: write tcp 38.102.83.210:32966->38.102.83.210:34353: write: connection reset by peer Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.552163 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng7j8\" (UniqueName: \"kubernetes.io/projected/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-kube-api-access-ng7j8\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.552215 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.552228 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.908619 4728 generic.go:334] "Generic (PLEG): container finished" podID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerID="c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d" exitCode=0 Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.908678 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgn94" event={"ID":"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983","Type":"ContainerDied","Data":"c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d"} Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.908712 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgn94" event={"ID":"d92dd2fc-9aa5-4d95-89bb-0e589a4fc983","Type":"ContainerDied","Data":"67a5cfe900b7a0cc44d25cb8a7b188116f21411ebf9f81f9a9ef062f43b8f6a9"} Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.908736 4728 scope.go:117] "RemoveContainer" containerID="c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.908903 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgn94" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.954435 4728 scope.go:117] "RemoveContainer" containerID="980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f" Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.955910 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgn94"] Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.968900 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgn94"] Dec 16 15:42:14 crc kubenswrapper[4728]: I1216 15:42:14.975802 4728 scope.go:117] "RemoveContainer" containerID="172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.044310 4728 scope.go:117] "RemoveContainer" containerID="c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d" Dec 16 15:42:15 crc kubenswrapper[4728]: E1216 15:42:15.044710 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d\": container with ID starting with c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d not found: ID does not exist" containerID="c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.044753 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d"} err="failed to get container status \"c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d\": rpc error: code = NotFound desc = could not find container \"c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d\": container with ID starting with c80b0937404e2400edd34a53309e3c49549af6d43da092daf2f42686fb99973d not found: ID does not exist" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.044786 4728 scope.go:117] "RemoveContainer" containerID="980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f" Dec 16 15:42:15 crc kubenswrapper[4728]: E1216 15:42:15.044994 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f\": container with ID starting with 980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f not found: ID does not exist" containerID="980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.045012 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f"} err="failed to get container status \"980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f\": rpc error: code = NotFound desc = could not find container \"980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f\": container with ID starting with 980d800cf8b62d12096efbf6ed14aed7f800aca32368b12e89279a009ffd7e1f not found: ID does not exist" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.045026 4728 scope.go:117] "RemoveContainer" containerID="172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e" Dec 16 15:42:15 crc kubenswrapper[4728]: E1216 15:42:15.045431 4728 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e\": container with ID starting with 172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e not found: ID does not exist" containerID="172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.045467 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e"} err="failed to get container status \"172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e\": rpc error: code = NotFound desc = could not find container \"172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e\": container with ID starting with 172834060f2cec13f59a4bb66978967444939b4aa8dd49fc0c6e0d89974b815e not found: ID does not exist" Dec 16 15:42:15 crc kubenswrapper[4728]: I1216 15:42:15.520961 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" path="/var/lib/kubelet/pods/d92dd2fc-9aa5-4d95-89bb-0e589a4fc983/volumes" Dec 16 15:42:17 crc kubenswrapper[4728]: E1216 15:42:17.104729 4728 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.210:32980->38.102.83.210:34353: write tcp 38.102.83.210:32980->38.102.83.210:34353: write: broken pipe Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.238107 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 15:42:46 crc kubenswrapper[4728]: E1216 15:42:46.239539 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="registry-server" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.239566 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="registry-server" Dec 16 15:42:46 crc kubenswrapper[4728]: E1216 15:42:46.239725 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="extract-utilities" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.239741 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="extract-utilities" Dec 16 15:42:46 crc kubenswrapper[4728]: E1216 15:42:46.239773 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="extract-content" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.239784 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="extract-content" Dec 16 15:42:46 crc kubenswrapper[4728]: E1216 15:42:46.239807 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3adc58c-a09d-4e32-bd59-10d32f1866ca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.239818 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3adc58c-a09d-4e32-bd59-10d32f1866ca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.240124 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92dd2fc-9aa5-4d95-89bb-0e589a4fc983" containerName="registry-server" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.240174 4728 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e3adc58c-a09d-4e32-bd59-10d32f1866ca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.241288 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.245643 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sbfdl" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.245803 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.245820 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.245848 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.248075 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402153 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402508 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402554 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-config-data\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402599 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402636 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402708 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " 
pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402731 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcbt\" (UniqueName: \"kubernetes.io/projected/78bce531-8ad9-43f3-9d5a-2edaf2df712f-kube-api-access-jfcbt\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402764 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.402804 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.504707 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.504898 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-config-data\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.505032 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.505157 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.505264 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.505331 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcbt\" (UniqueName: \"kubernetes.io/projected/78bce531-8ad9-43f3-9d5a-2edaf2df712f-kube-api-access-jfcbt\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 
16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.505807 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.506193 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.506553 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-config-data\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.506395 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.506865 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.508694 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.509680 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.510259 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.512702 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.512751 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.512810 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.531420 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcbt\" (UniqueName: \"kubernetes.io/projected/78bce531-8ad9-43f3-9d5a-2edaf2df712f-kube-api-access-jfcbt\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.558524 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " pod="openstack/tempest-tests-tempest" Dec 16 15:42:46 crc kubenswrapper[4728]: I1216 15:42:46.573909 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:42:47 crc kubenswrapper[4728]: I1216 15:42:47.065252 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 16 15:42:47 crc kubenswrapper[4728]: I1216 15:42:47.239837 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"78bce531-8ad9-43f3-9d5a-2edaf2df712f","Type":"ContainerStarted","Data":"f8eb5a440334c4b5c78cb6c24fe38dba8897cfda6e69abc296da1b49c0fdb562"} Dec 16 15:43:21 crc kubenswrapper[4728]: E1216 15:43:21.116387 4728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 16 15:43:21 crc kubenswrapper[4728]: E1216 15:43:21.117098 4728 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfcbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(78bce531-8ad9-43f3-9d5a-2edaf2df712f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 16 15:43:21 crc kubenswrapper[4728]: E1216 15:43:21.118324 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="78bce531-8ad9-43f3-9d5a-2edaf2df712f" Dec 16 15:43:21 crc kubenswrapper[4728]: E1216 15:43:21.556131 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="78bce531-8ad9-43f3-9d5a-2edaf2df712f" Dec 16 15:43:33 crc kubenswrapper[4728]: I1216 15:43:33.239520 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 16 15:43:34 crc kubenswrapper[4728]: I1216 15:43:34.661647 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"78bce531-8ad9-43f3-9d5a-2edaf2df712f","Type":"ContainerStarted","Data":"a38f3b5a8143d25704560897243d677a01d82a97f53d1026c4eb459780541662"} Dec 16 15:43:34 crc kubenswrapper[4728]: I1216 15:43:34.683029 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.5217554189999998 podStartE2EDuration="49.683008257s" podCreationTimestamp="2025-12-16 15:42:45 +0000 UTC" firstStartedPulling="2025-12-16 15:42:47.075805748 +0000 UTC m=+2747.915984742" lastFinishedPulling="2025-12-16 15:43:33.237058586 +0000 UTC m=+2794.077237580" observedRunningTime="2025-12-16 15:43:34.678580041 +0000 UTC m=+2795.518759055" watchObservedRunningTime="2025-12-16 15:43:34.683008257 +0000 UTC m=+2795.523187241" Dec 16 15:44:38 crc kubenswrapper[4728]: I1216 15:44:38.818212 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:44:38 crc kubenswrapper[4728]: I1216 15:44:38.818941 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.148331 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7"] Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.151228 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.154858 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.154947 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.160888 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7"] Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.327718 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-config-volume\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.327829 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-secret-volume\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.327877 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gj8\" (UniqueName: \"kubernetes.io/projected/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-kube-api-access-99gj8\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.429907 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-config-volume\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.430050 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-secret-volume\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.430127 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gj8\" (UniqueName: \"kubernetes.io/projected/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-kube-api-access-99gj8\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.431220 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-config-volume\") pod 
\"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.437492 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-secret-volume\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.446201 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gj8\" (UniqueName: \"kubernetes.io/projected/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-kube-api-access-99gj8\") pod \"collect-profiles-29431665-8rcj7\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.476509 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:00 crc kubenswrapper[4728]: I1216 15:45:00.946153 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7"] Dec 16 15:45:01 crc kubenswrapper[4728]: I1216 15:45:01.426224 4728 generic.go:334] "Generic (PLEG): container finished" podID="6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" containerID="9e23612ee4a218c77491ded38f269a55539065069f33eeea005e9f7eb48b1b17" exitCode=0 Dec 16 15:45:01 crc kubenswrapper[4728]: I1216 15:45:01.426292 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" event={"ID":"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a","Type":"ContainerDied","Data":"9e23612ee4a218c77491ded38f269a55539065069f33eeea005e9f7eb48b1b17"} Dec 16 15:45:01 crc kubenswrapper[4728]: I1216 15:45:01.426590 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" event={"ID":"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a","Type":"ContainerStarted","Data":"746a7b2941db34b1e704f00aa65d7696b40805546ee98534c107d1c261d77f33"} Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.816769 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.976199 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99gj8\" (UniqueName: \"kubernetes.io/projected/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-kube-api-access-99gj8\") pod \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.976311 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-secret-volume\") pod \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.976452 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-config-volume\") pod \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\" (UID: \"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a\") " Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.977582 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" (UID: "6f3bc5cc-b879-48ac-b049-a7ccd0c0748a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.988302 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" (UID: "6f3bc5cc-b879-48ac-b049-a7ccd0c0748a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:45:02 crc kubenswrapper[4728]: I1216 15:45:02.988871 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-kube-api-access-99gj8" (OuterVolumeSpecName: "kube-api-access-99gj8") pod "6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" (UID: "6f3bc5cc-b879-48ac-b049-a7ccd0c0748a"). InnerVolumeSpecName "kube-api-access-99gj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.078503 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.078806 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.078910 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99gj8\" (UniqueName: \"kubernetes.io/projected/6f3bc5cc-b879-48ac-b049-a7ccd0c0748a-kube-api-access-99gj8\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.458023 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" event={"ID":"6f3bc5cc-b879-48ac-b049-a7ccd0c0748a","Type":"ContainerDied","Data":"746a7b2941db34b1e704f00aa65d7696b40805546ee98534c107d1c261d77f33"} Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.458295 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="746a7b2941db34b1e704f00aa65d7696b40805546ee98534c107d1c261d77f33" Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.458083 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431665-8rcj7" Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.902171 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8"] Dec 16 15:45:03 crc kubenswrapper[4728]: I1216 15:45:03.912603 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431620-sv6v8"] Dec 16 15:45:05 crc kubenswrapper[4728]: I1216 15:45:05.519184 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79af2ee0-daef-4cca-ac1b-089d9e5be4ae" path="/var/lib/kubelet/pods/79af2ee0-daef-4cca-ac1b-089d9e5be4ae/volumes" Dec 16 15:45:08 crc kubenswrapper[4728]: I1216 15:45:08.818263 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:45:08 crc kubenswrapper[4728]: I1216 15:45:08.818671 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.349295 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94w7c"] Dec 16 15:45:10 crc kubenswrapper[4728]: E1216 15:45:10.350284 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" containerName="collect-profiles" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.350307 4728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" containerName="collect-profiles" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.350733 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3bc5cc-b879-48ac-b049-a7ccd0c0748a" containerName="collect-profiles" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.353365 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.365165 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94w7c"] Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.426009 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c79xx\" (UniqueName: \"kubernetes.io/projected/8c871183-98df-4efe-bb67-b85a643840c1-kube-api-access-c79xx\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.426085 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-utilities\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.426106 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-catalog-content\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.528714 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c79xx\" (UniqueName: \"kubernetes.io/projected/8c871183-98df-4efe-bb67-b85a643840c1-kube-api-access-c79xx\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.529160 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-utilities\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.529339 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-catalog-content\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.530075 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-catalog-content\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.530089 4728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-utilities\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.557224 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c79xx\" (UniqueName: \"kubernetes.io/projected/8c871183-98df-4efe-bb67-b85a643840c1-kube-api-access-c79xx\") pod \"certified-operators-94w7c\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:10 crc kubenswrapper[4728]: I1216 15:45:10.681634 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:11 crc kubenswrapper[4728]: I1216 15:45:11.180588 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94w7c"] Dec 16 15:45:11 crc kubenswrapper[4728]: I1216 15:45:11.530941 4728 generic.go:334] "Generic (PLEG): container finished" podID="8c871183-98df-4efe-bb67-b85a643840c1" containerID="d4533fb06a82c3e3016a4d872f37a3ad65b5a5e13a52893a239b4e79037aa253" exitCode=0 Dec 16 15:45:11 crc kubenswrapper[4728]: I1216 15:45:11.531014 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerDied","Data":"d4533fb06a82c3e3016a4d872f37a3ad65b5a5e13a52893a239b4e79037aa253"} Dec 16 15:45:11 crc kubenswrapper[4728]: I1216 15:45:11.531061 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerStarted","Data":"31795d0f352d625e1244a06ada58932c4f09f76b5bd7941517a3e5a1164254f0"} Dec 16 15:45:12 crc kubenswrapper[4728]: I1216 15:45:12.545359 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerStarted","Data":"34d256864d89307cd86cd9bec47c154ad963defb219cac1298e74c267c1393d2"} Dec 16 15:45:13 crc kubenswrapper[4728]: I1216 15:45:13.556981 4728 generic.go:334] "Generic (PLEG): container finished" podID="8c871183-98df-4efe-bb67-b85a643840c1" containerID="34d256864d89307cd86cd9bec47c154ad963defb219cac1298e74c267c1393d2" exitCode=0 Dec 16 15:45:13 crc kubenswrapper[4728]: I1216 15:45:13.557045 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerDied","Data":"34d256864d89307cd86cd9bec47c154ad963defb219cac1298e74c267c1393d2"} Dec 16 15:45:14 crc kubenswrapper[4728]: I1216 15:45:14.573305 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerStarted","Data":"723f0dd8d0121d481117d20746e4ef1a9a28404ea9cd7d6f9ba32454e4d05a6e"} Dec 16 15:45:14 crc kubenswrapper[4728]: I1216 15:45:14.598750 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94w7c" podStartSLOduration=2.113285054 podStartE2EDuration="4.598715325s" podCreationTimestamp="2025-12-16 15:45:10 +0000 UTC" firstStartedPulling="2025-12-16 
15:45:11.533436595 +0000 UTC m=+2892.373615619" lastFinishedPulling="2025-12-16 15:45:14.018866906 +0000 UTC m=+2894.859045890" observedRunningTime="2025-12-16 15:45:14.592798925 +0000 UTC m=+2895.432977949" watchObservedRunningTime="2025-12-16 15:45:14.598715325 +0000 UTC m=+2895.438894309" Dec 16 15:45:18 crc kubenswrapper[4728]: I1216 15:45:18.692775 4728 scope.go:117] "RemoveContainer" containerID="031a2d2e04ec331e2e85075abce071023cfa24f5e62f65467cc25018751e1b98" Dec 16 15:45:20 crc kubenswrapper[4728]: I1216 15:45:20.682844 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:20 crc kubenswrapper[4728]: I1216 15:45:20.683165 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:20 crc kubenswrapper[4728]: I1216 15:45:20.738342 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:21 crc kubenswrapper[4728]: I1216 15:45:21.698533 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:21 crc kubenswrapper[4728]: I1216 15:45:21.758869 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94w7c"] Dec 16 15:45:23 crc kubenswrapper[4728]: I1216 15:45:23.678115 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94w7c" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="registry-server" containerID="cri-o://723f0dd8d0121d481117d20746e4ef1a9a28404ea9cd7d6f9ba32454e4d05a6e" gracePeriod=2 Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.694994 4728 generic.go:334] "Generic (PLEG): container finished" podID="8c871183-98df-4efe-bb67-b85a643840c1" containerID="723f0dd8d0121d481117d20746e4ef1a9a28404ea9cd7d6f9ba32454e4d05a6e" exitCode=0 Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.695096 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerDied","Data":"723f0dd8d0121d481117d20746e4ef1a9a28404ea9cd7d6f9ba32454e4d05a6e"} Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.865161 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.962545 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c79xx\" (UniqueName: \"kubernetes.io/projected/8c871183-98df-4efe-bb67-b85a643840c1-kube-api-access-c79xx\") pod \"8c871183-98df-4efe-bb67-b85a643840c1\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.962606 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-utilities\") pod \"8c871183-98df-4efe-bb67-b85a643840c1\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.962807 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-catalog-content\") pod \"8c871183-98df-4efe-bb67-b85a643840c1\" (UID: \"8c871183-98df-4efe-bb67-b85a643840c1\") " Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.963513 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-utilities" (OuterVolumeSpecName: "utilities") pod "8c871183-98df-4efe-bb67-b85a643840c1" (UID: "8c871183-98df-4efe-bb67-b85a643840c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:45:24 crc kubenswrapper[4728]: I1216 15:45:24.968621 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c871183-98df-4efe-bb67-b85a643840c1-kube-api-access-c79xx" (OuterVolumeSpecName: "kube-api-access-c79xx") pod "8c871183-98df-4efe-bb67-b85a643840c1" (UID: "8c871183-98df-4efe-bb67-b85a643840c1"). InnerVolumeSpecName "kube-api-access-c79xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.018547 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c871183-98df-4efe-bb67-b85a643840c1" (UID: "8c871183-98df-4efe-bb67-b85a643840c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.064815 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.064860 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c79xx\" (UniqueName: \"kubernetes.io/projected/8c871183-98df-4efe-bb67-b85a643840c1-kube-api-access-c79xx\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.064879 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c871183-98df-4efe-bb67-b85a643840c1-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.713380 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94w7c" event={"ID":"8c871183-98df-4efe-bb67-b85a643840c1","Type":"ContainerDied","Data":"31795d0f352d625e1244a06ada58932c4f09f76b5bd7941517a3e5a1164254f0"} Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.714079 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94w7c" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.715336 4728 scope.go:117] "RemoveContainer" containerID="723f0dd8d0121d481117d20746e4ef1a9a28404ea9cd7d6f9ba32454e4d05a6e" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.745326 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94w7c"] Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.752548 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94w7c"] Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.757806 4728 scope.go:117] "RemoveContainer" containerID="34d256864d89307cd86cd9bec47c154ad963defb219cac1298e74c267c1393d2" Dec 16 15:45:25 crc kubenswrapper[4728]: I1216 15:45:25.782445 4728 scope.go:117] "RemoveContainer" containerID="d4533fb06a82c3e3016a4d872f37a3ad65b5a5e13a52893a239b4e79037aa253" Dec 16 15:45:27 crc kubenswrapper[4728]: I1216 15:45:27.516364 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c871183-98df-4efe-bb67-b85a643840c1" path="/var/lib/kubelet/pods/8c871183-98df-4efe-bb67-b85a643840c1/volumes" Dec 16 15:45:38 crc kubenswrapper[4728]: I1216 15:45:38.819036 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:45:38 crc kubenswrapper[4728]: I1216 15:45:38.819723 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:45:38 crc kubenswrapper[4728]: I1216 15:45:38.819782 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:45:38 crc kubenswrapper[4728]: I1216 15:45:38.820652 4728 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fc32d05477689ff87ff50e703467d8f097d88b4dace14a4b867ef90c3fad142"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:45:38 crc kubenswrapper[4728]: I1216 15:45:38.820724 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://8fc32d05477689ff87ff50e703467d8f097d88b4dace14a4b867ef90c3fad142" gracePeriod=600 Dec 16 15:45:39 crc kubenswrapper[4728]: I1216 15:45:39.862786 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="8fc32d05477689ff87ff50e703467d8f097d88b4dace14a4b867ef90c3fad142" exitCode=0 Dec 16 15:45:39 crc kubenswrapper[4728]: I1216 15:45:39.862880 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"8fc32d05477689ff87ff50e703467d8f097d88b4dace14a4b867ef90c3fad142"} Dec 16 15:45:39 crc kubenswrapper[4728]: I1216 15:45:39.863401 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55"} Dec 16 15:45:39 crc kubenswrapper[4728]: I1216 15:45:39.863512 4728 scope.go:117] "RemoveContainer" containerID="1150a1e39b05c25a5e10ef69216ae641c03fff72b75bc3223f502a8299b7d1a5" Dec 16 15:48:08 crc kubenswrapper[4728]: I1216 15:48:08.819018 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:48:08 crc kubenswrapper[4728]: I1216 15:48:08.819661 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:48:38 crc kubenswrapper[4728]: I1216 15:48:38.818270 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:48:38 crc kubenswrapper[4728]: I1216 15:48:38.819007 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:49:08 crc kubenswrapper[4728]: I1216 15:49:08.818676 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:49:08 crc kubenswrapper[4728]: I1216 15:49:08.819572 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:49:08 crc kubenswrapper[4728]: I1216 15:49:08.819653 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:49:08 crc kubenswrapper[4728]: I1216 15:49:08.820973 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:49:08 crc kubenswrapper[4728]: I1216 15:49:08.821120 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" gracePeriod=600 Dec 16 15:49:08 crc kubenswrapper[4728]: E1216 15:49:08.949067 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:49:09 crc kubenswrapper[4728]: I1216 15:49:09.061476 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" exitCode=0 Dec 16 15:49:09 crc kubenswrapper[4728]: I1216 15:49:09.061529 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55"} Dec 16 15:49:09 crc kubenswrapper[4728]: I1216 15:49:09.061565 4728 scope.go:117] "RemoveContainer" containerID="8fc32d05477689ff87ff50e703467d8f097d88b4dace14a4b867ef90c3fad142" Dec 16 15:49:09 crc kubenswrapper[4728]: I1216 15:49:09.062375 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:49:09 crc kubenswrapper[4728]: E1216 15:49:09.062969 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" 
Dec 16 15:49:19 crc kubenswrapper[4728]: I1216 15:49:19.515534 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:49:19 crc kubenswrapper[4728]: E1216 15:49:19.516563 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:49:34 crc kubenswrapper[4728]: I1216 15:49:34.506836 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:49:34 crc kubenswrapper[4728]: E1216 15:49:34.507432 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:49:48 crc kubenswrapper[4728]: I1216 15:49:48.506622 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:49:48 crc kubenswrapper[4728]: E1216 15:49:48.507590 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.631744 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z9z4x"] Dec 16 15:49:59 crc kubenswrapper[4728]: E1216 15:49:59.632635 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="extract-content" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.632648 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="extract-content" Dec 16 15:49:59 crc kubenswrapper[4728]: E1216 15:49:59.632671 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="extract-utilities" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.632677 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="extract-utilities" Dec 16 15:49:59 crc kubenswrapper[4728]: E1216 15:49:59.632698 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="registry-server" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.632704 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c871183-98df-4efe-bb67-b85a643840c1" containerName="registry-server" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.632904 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c871183-98df-4efe-bb67-b85a643840c1" 
containerName="registry-server" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.634375 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.650434 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z9z4x"] Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.742911 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-utilities\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.742996 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-catalog-content\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.743426 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csctv\" (UniqueName: \"kubernetes.io/projected/95f6ddba-478c-4b1f-84e6-ef4a0862b271-kube-api-access-csctv\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.845515 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-utilities\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.845617 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-catalog-content\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.845737 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csctv\" (UniqueName: \"kubernetes.io/projected/95f6ddba-478c-4b1f-84e6-ef4a0862b271-kube-api-access-csctv\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.846456 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-utilities\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.846538 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-catalog-content\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " 
pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.864363 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csctv\" (UniqueName: \"kubernetes.io/projected/95f6ddba-478c-4b1f-84e6-ef4a0862b271-kube-api-access-csctv\") pod \"community-operators-z9z4x\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:49:59 crc kubenswrapper[4728]: I1216 15:49:59.970391 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:00 crc kubenswrapper[4728]: I1216 15:50:00.493725 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z9z4x"] Dec 16 15:50:00 crc kubenswrapper[4728]: I1216 15:50:00.582593 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z9z4x" event={"ID":"95f6ddba-478c-4b1f-84e6-ef4a0862b271","Type":"ContainerStarted","Data":"f9ff259b6a1f1d7f90fb27d0cd0e38db3c2f2f2affdcd6f36092c08d287bf528"} Dec 16 15:50:01 crc kubenswrapper[4728]: I1216 15:50:01.507654 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:50:01 crc kubenswrapper[4728]: E1216 15:50:01.508242 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:50:01 crc kubenswrapper[4728]: I1216 15:50:01.616849 4728 generic.go:334] "Generic (PLEG): container finished" podID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerID="0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6" exitCode=0 Dec 16 15:50:01 crc kubenswrapper[4728]: I1216 15:50:01.616924 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z9z4x" event={"ID":"95f6ddba-478c-4b1f-84e6-ef4a0862b271","Type":"ContainerDied","Data":"0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6"} Dec 16 15:50:01 crc kubenswrapper[4728]: I1216 15:50:01.623066 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:50:03 crc kubenswrapper[4728]: I1216 15:50:03.634268 4728 generic.go:334] "Generic (PLEG): container finished" podID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerID="634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba" exitCode=0 Dec 16 15:50:03 crc kubenswrapper[4728]: I1216 15:50:03.634928 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z9z4x" event={"ID":"95f6ddba-478c-4b1f-84e6-ef4a0862b271","Type":"ContainerDied","Data":"634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba"} Dec 16 15:50:05 crc kubenswrapper[4728]: I1216 15:50:05.654666 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z9z4x" event={"ID":"95f6ddba-478c-4b1f-84e6-ef4a0862b271","Type":"ContainerStarted","Data":"5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c"} Dec 16 15:50:05 crc kubenswrapper[4728]: I1216 15:50:05.679040 4728 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z9z4x" podStartSLOduration=3.012616185 podStartE2EDuration="6.679020315s" podCreationTimestamp="2025-12-16 15:49:59 +0000 UTC" firstStartedPulling="2025-12-16 15:50:01.62267551 +0000 UTC m=+3182.462854524" lastFinishedPulling="2025-12-16 15:50:05.28907967 +0000 UTC m=+3186.129258654" observedRunningTime="2025-12-16 15:50:05.678071119 +0000 UTC m=+3186.518250143" watchObservedRunningTime="2025-12-16 15:50:05.679020315 +0000 UTC m=+3186.519199299" Dec 16 15:50:09 crc kubenswrapper[4728]: I1216 15:50:09.971122 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:09 crc kubenswrapper[4728]: I1216 15:50:09.971762 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:10 crc kubenswrapper[4728]: I1216 15:50:10.033998 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:10 crc kubenswrapper[4728]: I1216 15:50:10.772385 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:10 crc kubenswrapper[4728]: I1216 15:50:10.838994 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z9z4x"] Dec 16 15:50:12 crc kubenswrapper[4728]: I1216 15:50:12.723222 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z9z4x" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="registry-server" containerID="cri-o://5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c" gracePeriod=2 Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.213670 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.332721 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csctv\" (UniqueName: \"kubernetes.io/projected/95f6ddba-478c-4b1f-84e6-ef4a0862b271-kube-api-access-csctv\") pod \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.332894 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-catalog-content\") pod \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.332930 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-utilities\") pod \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\" (UID: \"95f6ddba-478c-4b1f-84e6-ef4a0862b271\") " Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.334681 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-utilities" (OuterVolumeSpecName: "utilities") pod "95f6ddba-478c-4b1f-84e6-ef4a0862b271" (UID: "95f6ddba-478c-4b1f-84e6-ef4a0862b271"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.339770 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f6ddba-478c-4b1f-84e6-ef4a0862b271-kube-api-access-csctv" (OuterVolumeSpecName: "kube-api-access-csctv") pod "95f6ddba-478c-4b1f-84e6-ef4a0862b271" (UID: "95f6ddba-478c-4b1f-84e6-ef4a0862b271"). InnerVolumeSpecName "kube-api-access-csctv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.389708 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95f6ddba-478c-4b1f-84e6-ef4a0862b271" (UID: "95f6ddba-478c-4b1f-84e6-ef4a0862b271"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.435501 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.435536 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f6ddba-478c-4b1f-84e6-ef4a0862b271-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.435545 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csctv\" (UniqueName: \"kubernetes.io/projected/95f6ddba-478c-4b1f-84e6-ef4a0862b271-kube-api-access-csctv\") on node \"crc\" DevicePath \"\"" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.507056 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:50:13 crc kubenswrapper[4728]: E1216 15:50:13.507501 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.737151 4728 generic.go:334] "Generic (PLEG): container finished" podID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerID="5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c" exitCode=0 Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.737238 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z9z4x" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.737263 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z9z4x" event={"ID":"95f6ddba-478c-4b1f-84e6-ef4a0862b271","Type":"ContainerDied","Data":"5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c"} Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.737625 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z9z4x" event={"ID":"95f6ddba-478c-4b1f-84e6-ef4a0862b271","Type":"ContainerDied","Data":"f9ff259b6a1f1d7f90fb27d0cd0e38db3c2f2f2affdcd6f36092c08d287bf528"} Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.737643 4728 scope.go:117] "RemoveContainer" containerID="5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.760157 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z9z4x"] Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.768788 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z9z4x"] Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.777149 4728 scope.go:117] "RemoveContainer" containerID="634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.795455 4728 scope.go:117] "RemoveContainer" containerID="0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.839389 4728 scope.go:117] "RemoveContainer" containerID="5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c" Dec 16 15:50:13 crc kubenswrapper[4728]: E1216 15:50:13.840998 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c\": container with ID starting with 5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c not found: ID does not exist" containerID="5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.841043 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c"} err="failed to get container status \"5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c\": rpc error: code = NotFound desc = could not find container \"5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c\": container with ID starting with 5c02e520c14d1275d0ffc1ab150dffb0e5fb4b8247cb53ef506213bc334c3e3c not found: ID does not exist" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.841070 4728 scope.go:117] "RemoveContainer" containerID="634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba" Dec 16 15:50:13 crc kubenswrapper[4728]: E1216 15:50:13.841578 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba\": container with ID starting with 634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba not found: ID does not exist" containerID="634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.841676 4728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba"} err="failed to get container status \"634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba\": rpc error: code = NotFound desc = could not find container \"634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba\": container with ID starting with 634b3b524427deed494b12abd12f3a0fd1ca55f7d2eff05cf267292bd0ce3dba not found: ID does not exist" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.841753 4728 scope.go:117] "RemoveContainer" containerID="0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6" Dec 16 15:50:13 crc kubenswrapper[4728]: E1216 15:50:13.842282 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6\": container with ID starting with 0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6 not found: ID does not exist" containerID="0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6" Dec 16 15:50:13 crc kubenswrapper[4728]: I1216 15:50:13.842339 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6"} err="failed to get container status \"0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6\": rpc error: code = NotFound desc = could not find container \"0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6\": container with ID starting with 0537203ac2442126b26449ecb8d5dac45a5b70aecf7f92360ce34144695e3ed6 not found: ID does not exist" Dec 16 15:50:15 crc kubenswrapper[4728]: I1216 15:50:15.521295 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" path="/var/lib/kubelet/pods/95f6ddba-478c-4b1f-84e6-ef4a0862b271/volumes" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.672151 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v4m4t"] Dec 16 15:50:22 crc kubenswrapper[4728]: E1216 15:50:22.673038 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="extract-utilities" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.673050 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="extract-utilities" Dec 16 15:50:22 crc kubenswrapper[4728]: E1216 15:50:22.673067 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="extract-content" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.673075 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="extract-content" Dec 16 15:50:22 crc kubenswrapper[4728]: E1216 15:50:22.673085 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="registry-server" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.673093 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" containerName="registry-server" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.673255 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f6ddba-478c-4b1f-84e6-ef4a0862b271" 
containerName="registry-server" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.674549 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.682236 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4m4t"] Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.816816 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-catalog-content\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.816927 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkp5\" (UniqueName: \"kubernetes.io/projected/ca508d5e-a593-4489-899d-79f74665c193-kube-api-access-fkkp5\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.817167 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-utilities\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.919236 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkp5\" (UniqueName: \"kubernetes.io/projected/ca508d5e-a593-4489-899d-79f74665c193-kube-api-access-fkkp5\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.919725 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-utilities\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.919869 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-catalog-content\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.920394 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-utilities\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.921576 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-catalog-content\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 
16 15:50:22 crc kubenswrapper[4728]: I1216 15:50:22.950483 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkp5\" (UniqueName: \"kubernetes.io/projected/ca508d5e-a593-4489-899d-79f74665c193-kube-api-access-fkkp5\") pod \"redhat-operators-v4m4t\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:23 crc kubenswrapper[4728]: I1216 15:50:23.006129 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:23 crc kubenswrapper[4728]: I1216 15:50:23.518458 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4m4t"] Dec 16 15:50:23 crc kubenswrapper[4728]: I1216 15:50:23.825787 4728 generic.go:334] "Generic (PLEG): container finished" podID="ca508d5e-a593-4489-899d-79f74665c193" containerID="b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52" exitCode=0 Dec 16 15:50:23 crc kubenswrapper[4728]: I1216 15:50:23.825883 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerDied","Data":"b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52"} Dec 16 15:50:23 crc kubenswrapper[4728]: I1216 15:50:23.826109 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerStarted","Data":"a9b59b23cbbfd963f39dd78df51b857b9757cba6ef0b37bec4c66da5d1923499"} Dec 16 15:50:25 crc kubenswrapper[4728]: I1216 15:50:25.854667 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerStarted","Data":"b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e"} Dec 16 15:50:26 crc kubenswrapper[4728]: I1216 15:50:26.506986 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:50:26 crc kubenswrapper[4728]: E1216 15:50:26.507309 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:50:29 crc kubenswrapper[4728]: I1216 15:50:29.889833 4728 generic.go:334] "Generic (PLEG): container finished" podID="ca508d5e-a593-4489-899d-79f74665c193" containerID="b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e" exitCode=0 Dec 16 15:50:29 crc kubenswrapper[4728]: I1216 15:50:29.889912 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerDied","Data":"b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e"} Dec 16 15:50:30 crc kubenswrapper[4728]: I1216 15:50:30.901807 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerStarted","Data":"6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310"} Dec 16 15:50:30 crc 
kubenswrapper[4728]: I1216 15:50:30.922283 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v4m4t" podStartSLOduration=2.074343192 podStartE2EDuration="8.922260137s" podCreationTimestamp="2025-12-16 15:50:22 +0000 UTC" firstStartedPulling="2025-12-16 15:50:23.827310272 +0000 UTC m=+3204.667489256" lastFinishedPulling="2025-12-16 15:50:30.675227207 +0000 UTC m=+3211.515406201" observedRunningTime="2025-12-16 15:50:30.920063028 +0000 UTC m=+3211.760242022" watchObservedRunningTime="2025-12-16 15:50:30.922260137 +0000 UTC m=+3211.762439121" Dec 16 15:50:33 crc kubenswrapper[4728]: I1216 15:50:33.006368 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:33 crc kubenswrapper[4728]: I1216 15:50:33.006628 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:34 crc kubenswrapper[4728]: I1216 15:50:34.050232 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v4m4t" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="registry-server" probeResult="failure" output=< Dec 16 15:50:34 crc kubenswrapper[4728]: timeout: failed to connect service ":50051" within 1s Dec 16 15:50:34 crc kubenswrapper[4728]: > Dec 16 15:50:40 crc kubenswrapper[4728]: I1216 15:50:40.506639 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:50:40 crc kubenswrapper[4728]: E1216 15:50:40.507874 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:50:43 crc kubenswrapper[4728]: I1216 15:50:43.091536 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:43 crc kubenswrapper[4728]: I1216 15:50:43.190858 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:43 crc kubenswrapper[4728]: I1216 15:50:43.334568 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v4m4t"] Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.041029 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v4m4t" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="registry-server" containerID="cri-o://6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310" gracePeriod=2 Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.527143 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.688073 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkkp5\" (UniqueName: \"kubernetes.io/projected/ca508d5e-a593-4489-899d-79f74665c193-kube-api-access-fkkp5\") pod \"ca508d5e-a593-4489-899d-79f74665c193\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.688464 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-utilities\") pod \"ca508d5e-a593-4489-899d-79f74665c193\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.688499 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-catalog-content\") pod \"ca508d5e-a593-4489-899d-79f74665c193\" (UID: \"ca508d5e-a593-4489-899d-79f74665c193\") " Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.690039 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-utilities" (OuterVolumeSpecName: "utilities") pod "ca508d5e-a593-4489-899d-79f74665c193" (UID: "ca508d5e-a593-4489-899d-79f74665c193"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.704953 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca508d5e-a593-4489-899d-79f74665c193-kube-api-access-fkkp5" (OuterVolumeSpecName: "kube-api-access-fkkp5") pod "ca508d5e-a593-4489-899d-79f74665c193" (UID: "ca508d5e-a593-4489-899d-79f74665c193"). InnerVolumeSpecName "kube-api-access-fkkp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.791278 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.791317 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkkp5\" (UniqueName: \"kubernetes.io/projected/ca508d5e-a593-4489-899d-79f74665c193-kube-api-access-fkkp5\") on node \"crc\" DevicePath \"\"" Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.826844 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca508d5e-a593-4489-899d-79f74665c193" (UID: "ca508d5e-a593-4489-899d-79f74665c193"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:50:45 crc kubenswrapper[4728]: I1216 15:50:45.894363 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca508d5e-a593-4489-899d-79f74665c193-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.055950 4728 generic.go:334] "Generic (PLEG): container finished" podID="ca508d5e-a593-4489-899d-79f74665c193" containerID="6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310" exitCode=0 Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.055996 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerDied","Data":"6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310"} Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.056030 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4m4t" event={"ID":"ca508d5e-a593-4489-899d-79f74665c193","Type":"ContainerDied","Data":"a9b59b23cbbfd963f39dd78df51b857b9757cba6ef0b37bec4c66da5d1923499"} Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.056051 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4m4t" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.056063 4728 scope.go:117] "RemoveContainer" containerID="6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.087640 4728 scope.go:117] "RemoveContainer" containerID="b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.103728 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v4m4t"] Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.112754 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v4m4t"] Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.112882 4728 scope.go:117] "RemoveContainer" containerID="b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.155360 4728 scope.go:117] "RemoveContainer" containerID="6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310" Dec 16 15:50:46 crc kubenswrapper[4728]: E1216 15:50:46.155865 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310\": container with ID starting with 6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310 not found: ID does not exist" containerID="6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.155913 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310"} err="failed to get container status \"6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310\": rpc error: code = NotFound desc = could not find container \"6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310\": container with ID starting with 6bb59131eabf02a8067fee72cd1cf8b39a64b1384b7e5c49438bef0c22e15310 not found: ID does not exist" Dec 16 15:50:46 crc 
kubenswrapper[4728]: I1216 15:50:46.155946 4728 scope.go:117] "RemoveContainer" containerID="b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e" Dec 16 15:50:46 crc kubenswrapper[4728]: E1216 15:50:46.156451 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e\": container with ID starting with b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e not found: ID does not exist" containerID="b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.156492 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e"} err="failed to get container status \"b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e\": rpc error: code = NotFound desc = could not find container \"b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e\": container with ID starting with b9af1f05d900a1f64e936c29a863ada23e0a7ce0d49e68ec30facc298138d94e not found: ID does not exist" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.156514 4728 scope.go:117] "RemoveContainer" containerID="b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52" Dec 16 15:50:46 crc kubenswrapper[4728]: E1216 15:50:46.156838 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52\": container with ID starting with b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52 not found: ID does not exist" containerID="b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52" Dec 16 15:50:46 crc kubenswrapper[4728]: I1216 15:50:46.156876 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52"} err="failed to get container status \"b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52\": rpc error: code = NotFound desc = could not find container \"b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52\": container with ID starting with b38d6178504465e7e52689838440c8ad8adee0a00b4f332f50c8ddc13f64ac52 not found: ID does not exist" Dec 16 15:50:47 crc kubenswrapper[4728]: I1216 15:50:47.517599 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca508d5e-a593-4489-899d-79f74665c193" path="/var/lib/kubelet/pods/ca508d5e-a593-4489-899d-79f74665c193/volumes" Dec 16 15:50:52 crc kubenswrapper[4728]: I1216 15:50:52.506574 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:50:52 crc kubenswrapper[4728]: E1216 15:50:52.507635 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:51:07 crc kubenswrapper[4728]: I1216 15:51:07.506708 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" 
Dec 16 15:51:07 crc kubenswrapper[4728]: E1216 15:51:07.508000 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:51:19 crc kubenswrapper[4728]: I1216 15:51:19.512444 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:51:19 crc kubenswrapper[4728]: E1216 15:51:19.513226 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:51:31 crc kubenswrapper[4728]: I1216 15:51:31.507069 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:51:31 crc kubenswrapper[4728]: E1216 15:51:31.507917 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:51:46 crc kubenswrapper[4728]: I1216 15:51:46.506515 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:51:46 crc kubenswrapper[4728]: E1216 15:51:46.507614 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:51:57 crc kubenswrapper[4728]: I1216 15:51:57.506772 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:51:57 crc kubenswrapper[4728]: E1216 15:51:57.507928 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.103234 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6jvv6"] Dec 16 15:52:03 crc kubenswrapper[4728]: E1216 15:52:03.104904 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="extract-content" 
Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.104924 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="extract-content" Dec 16 15:52:03 crc kubenswrapper[4728]: E1216 15:52:03.104947 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="extract-utilities" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.104969 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="extract-utilities" Dec 16 15:52:03 crc kubenswrapper[4728]: E1216 15:52:03.104984 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="registry-server" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.104991 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="registry-server" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.105217 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca508d5e-a593-4489-899d-79f74665c193" containerName="registry-server" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.107186 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.119286 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jvv6"] Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.224036 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-utilities\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.224097 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc2d8\" (UniqueName: \"kubernetes.io/projected/04b75b38-f1a1-42ca-a68d-28eded69163f-kube-api-access-tc2d8\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.224196 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-catalog-content\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.325818 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc2d8\" (UniqueName: \"kubernetes.io/projected/04b75b38-f1a1-42ca-a68d-28eded69163f-kube-api-access-tc2d8\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.325951 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-catalog-content\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " 
pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.326078 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-utilities\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.326487 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-catalog-content\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.326511 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-utilities\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.346713 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc2d8\" (UniqueName: \"kubernetes.io/projected/04b75b38-f1a1-42ca-a68d-28eded69163f-kube-api-access-tc2d8\") pod \"redhat-marketplace-6jvv6\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.436878 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:03 crc kubenswrapper[4728]: I1216 15:52:03.932514 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jvv6"] Dec 16 15:52:04 crc kubenswrapper[4728]: I1216 15:52:04.956770 4728 generic.go:334] "Generic (PLEG): container finished" podID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerID="eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0" exitCode=0 Dec 16 15:52:04 crc kubenswrapper[4728]: I1216 15:52:04.956847 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jvv6" event={"ID":"04b75b38-f1a1-42ca-a68d-28eded69163f","Type":"ContainerDied","Data":"eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0"} Dec 16 15:52:04 crc kubenswrapper[4728]: I1216 15:52:04.957093 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jvv6" event={"ID":"04b75b38-f1a1-42ca-a68d-28eded69163f","Type":"ContainerStarted","Data":"89a1e53842e414baec65f9c257e396ef859fd6e76a3b77f6a91bd5cee978c9f7"} Dec 16 15:52:06 crc kubenswrapper[4728]: I1216 15:52:06.975724 4728 generic.go:334] "Generic (PLEG): container finished" podID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerID="56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8" exitCode=0 Dec 16 15:52:06 crc kubenswrapper[4728]: I1216 15:52:06.975798 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jvv6" event={"ID":"04b75b38-f1a1-42ca-a68d-28eded69163f","Type":"ContainerDied","Data":"56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8"} Dec 16 15:52:07 crc kubenswrapper[4728]: I1216 15:52:07.987595 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6jvv6" event={"ID":"04b75b38-f1a1-42ca-a68d-28eded69163f","Type":"ContainerStarted","Data":"76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce"} Dec 16 15:52:08 crc kubenswrapper[4728]: I1216 15:52:08.011326 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6jvv6" podStartSLOduration=2.4572633010000002 podStartE2EDuration="5.011306989s" podCreationTimestamp="2025-12-16 15:52:03 +0000 UTC" firstStartedPulling="2025-12-16 15:52:04.962241274 +0000 UTC m=+3305.802420258" lastFinishedPulling="2025-12-16 15:52:07.516284932 +0000 UTC m=+3308.356463946" observedRunningTime="2025-12-16 15:52:08.00432313 +0000 UTC m=+3308.844502104" watchObservedRunningTime="2025-12-16 15:52:08.011306989 +0000 UTC m=+3308.851485973" Dec 16 15:52:12 crc kubenswrapper[4728]: I1216 15:52:12.506827 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:52:12 crc kubenswrapper[4728]: E1216 15:52:12.507717 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:52:13 crc kubenswrapper[4728]: I1216 15:52:13.437033 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:13 crc kubenswrapper[4728]: I1216 15:52:13.437434 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:13 crc kubenswrapper[4728]: I1216 15:52:13.488990 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:14 crc kubenswrapper[4728]: I1216 15:52:14.098388 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:14 crc kubenswrapper[4728]: I1216 15:52:14.169499 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jvv6"] Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.067321 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6jvv6" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="registry-server" containerID="cri-o://76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce" gracePeriod=2 Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.590619 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.639195 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-utilities\") pod \"04b75b38-f1a1-42ca-a68d-28eded69163f\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.639236 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc2d8\" (UniqueName: \"kubernetes.io/projected/04b75b38-f1a1-42ca-a68d-28eded69163f-kube-api-access-tc2d8\") pod \"04b75b38-f1a1-42ca-a68d-28eded69163f\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.640190 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-utilities" (OuterVolumeSpecName: "utilities") pod "04b75b38-f1a1-42ca-a68d-28eded69163f" (UID: "04b75b38-f1a1-42ca-a68d-28eded69163f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.644692 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b75b38-f1a1-42ca-a68d-28eded69163f-kube-api-access-tc2d8" (OuterVolumeSpecName: "kube-api-access-tc2d8") pod "04b75b38-f1a1-42ca-a68d-28eded69163f" (UID: "04b75b38-f1a1-42ca-a68d-28eded69163f"). InnerVolumeSpecName "kube-api-access-tc2d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.740498 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-catalog-content\") pod \"04b75b38-f1a1-42ca-a68d-28eded69163f\" (UID: \"04b75b38-f1a1-42ca-a68d-28eded69163f\") " Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.741207 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.741229 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc2d8\" (UniqueName: \"kubernetes.io/projected/04b75b38-f1a1-42ca-a68d-28eded69163f-kube-api-access-tc2d8\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.760369 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04b75b38-f1a1-42ca-a68d-28eded69163f" (UID: "04b75b38-f1a1-42ca-a68d-28eded69163f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:52:16 crc kubenswrapper[4728]: I1216 15:52:16.842463 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b75b38-f1a1-42ca-a68d-28eded69163f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.079776 4728 generic.go:334] "Generic (PLEG): container finished" podID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerID="76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce" exitCode=0 Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.079816 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jvv6" event={"ID":"04b75b38-f1a1-42ca-a68d-28eded69163f","Type":"ContainerDied","Data":"76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce"} Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.079841 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jvv6" event={"ID":"04b75b38-f1a1-42ca-a68d-28eded69163f","Type":"ContainerDied","Data":"89a1e53842e414baec65f9c257e396ef859fd6e76a3b77f6a91bd5cee978c9f7"} Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.079859 4728 scope.go:117] "RemoveContainer" containerID="76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.079967 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jvv6" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.113651 4728 scope.go:117] "RemoveContainer" containerID="56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.123742 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jvv6"] Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.138936 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jvv6"] Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.147564 4728 scope.go:117] "RemoveContainer" containerID="eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.193014 4728 scope.go:117] "RemoveContainer" containerID="76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce" Dec 16 15:52:17 crc kubenswrapper[4728]: E1216 15:52:17.193544 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce\": container with ID starting with 76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce not found: ID does not exist" containerID="76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.193584 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce"} err="failed to get container status \"76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce\": rpc error: code = NotFound desc = could not find container \"76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce\": container with ID starting with 76761703a1fc54977c417079262d7fa972bae181b86072dedd5170f1d24d7dce not found: ID does not exist" Dec 16 15:52:17 
crc kubenswrapper[4728]: I1216 15:52:17.193610 4728 scope.go:117] "RemoveContainer" containerID="56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8" Dec 16 15:52:17 crc kubenswrapper[4728]: E1216 15:52:17.194011 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8\": container with ID starting with 56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8 not found: ID does not exist" containerID="56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.194043 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8"} err="failed to get container status \"56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8\": rpc error: code = NotFound desc = could not find container \"56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8\": container with ID starting with 56232cf3641524a93b4851f9041d041870e2cd4ab5a750009fc4df378d6520b8 not found: ID does not exist" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.194063 4728 scope.go:117] "RemoveContainer" containerID="eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0" Dec 16 15:52:17 crc kubenswrapper[4728]: E1216 15:52:17.194571 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0\": container with ID starting with eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0 not found: ID does not exist" containerID="eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.194629 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0"} err="failed to get container status \"eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0\": rpc error: code = NotFound desc = could not find container \"eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0\": container with ID starting with eedc1806fae8c2f1f044aaa07a27d3be53e5e7b3a8f27ba7d1373475e2776cf0 not found: ID does not exist" Dec 16 15:52:17 crc kubenswrapper[4728]: I1216 15:52:17.519054 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" path="/var/lib/kubelet/pods/04b75b38-f1a1-42ca-a68d-28eded69163f/volumes" Dec 16 15:52:25 crc kubenswrapper[4728]: I1216 15:52:25.507212 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:52:25 crc kubenswrapper[4728]: E1216 15:52:25.508285 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:52:40 crc kubenswrapper[4728]: I1216 15:52:40.506085 4728 scope.go:117] "RemoveContainer" 
containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:52:40 crc kubenswrapper[4728]: E1216 15:52:40.506850 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:52:52 crc kubenswrapper[4728]: I1216 15:52:52.507045 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:52:52 crc kubenswrapper[4728]: E1216 15:52:52.507788 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:53:06 crc kubenswrapper[4728]: I1216 15:53:06.507740 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:53:06 crc kubenswrapper[4728]: E1216 15:53:06.508513 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:53:17 crc kubenswrapper[4728]: I1216 15:53:17.507487 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:53:17 crc kubenswrapper[4728]: E1216 15:53:17.508117 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:53:31 crc kubenswrapper[4728]: I1216 15:53:31.506867 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:53:31 crc kubenswrapper[4728]: E1216 15:53:31.512175 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:53:42 crc kubenswrapper[4728]: I1216 15:53:42.507160 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:53:42 crc kubenswrapper[4728]: E1216 15:53:42.508499 4728 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:53:53 crc kubenswrapper[4728]: I1216 15:53:53.507378 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:53:53 crc kubenswrapper[4728]: E1216 15:53:53.510106 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:54:08 crc kubenswrapper[4728]: I1216 15:54:08.507071 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:54:08 crc kubenswrapper[4728]: E1216 15:54:08.508149 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 15:54:20 crc kubenswrapper[4728]: I1216 15:54:20.507161 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:54:21 crc kubenswrapper[4728]: I1216 15:54:21.300786 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"6057ec8bf018ddc636f1fdb8a788ff08f7c76a9cc99e3eea7068955f5d92eec4"} Dec 16 15:54:45 crc kubenswrapper[4728]: I1216 15:54:45.547399 4728 generic.go:334] "Generic (PLEG): container finished" podID="78bce531-8ad9-43f3-9d5a-2edaf2df712f" containerID="a38f3b5a8143d25704560897243d677a01d82a97f53d1026c4eb459780541662" exitCode=0 Dec 16 15:54:45 crc kubenswrapper[4728]: I1216 15:54:45.547450 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"78bce531-8ad9-43f3-9d5a-2edaf2df712f","Type":"ContainerDied","Data":"a38f3b5a8143d25704560897243d677a01d82a97f53d1026c4eb459780541662"} Dec 16 15:54:46 crc kubenswrapper[4728]: I1216 15:54:46.927197 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.058904 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ssh-key\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059056 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059082 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-workdir\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059192 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059227 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config-secret\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059247 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ca-certs\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059288 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-config-data\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059345 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-temporary\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.059386 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfcbt\" (UniqueName: \"kubernetes.io/projected/78bce531-8ad9-43f3-9d5a-2edaf2df712f-kube-api-access-jfcbt\") pod \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\" (UID: \"78bce531-8ad9-43f3-9d5a-2edaf2df712f\") " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.060262 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.060621 4728 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.060387 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-config-data" (OuterVolumeSpecName: "config-data") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.065581 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bce531-8ad9-43f3-9d5a-2edaf2df712f-kube-api-access-jfcbt" (OuterVolumeSpecName: "kube-api-access-jfcbt") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "kube-api-access-jfcbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.068016 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.069270 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.089302 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.094080 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.101557 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.114971 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "78bce531-8ad9-43f3-9d5a-2edaf2df712f" (UID: "78bce531-8ad9-43f3-9d5a-2edaf2df712f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162453 4728 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162494 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162510 4728 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/78bce531-8ad9-43f3-9d5a-2edaf2df712f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162552 4728 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162566 4728 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162579 4728 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/78bce531-8ad9-43f3-9d5a-2edaf2df712f-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162589 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78bce531-8ad9-43f3-9d5a-2edaf2df712f-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.162600 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfcbt\" (UniqueName: \"kubernetes.io/projected/78bce531-8ad9-43f3-9d5a-2edaf2df712f-kube-api-access-jfcbt\") on node \"crc\" DevicePath \"\"" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.220286 4728 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.264763 4728 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 
16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.567145 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"78bce531-8ad9-43f3-9d5a-2edaf2df712f","Type":"ContainerDied","Data":"f8eb5a440334c4b5c78cb6c24fe38dba8897cfda6e69abc296da1b49c0fdb562"} Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.567188 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8eb5a440334c4b5c78cb6c24fe38dba8897cfda6e69abc296da1b49c0fdb562" Dec 16 15:54:47 crc kubenswrapper[4728]: I1216 15:54:47.567263 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.976359 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 15:54:50 crc kubenswrapper[4728]: E1216 15:54:50.977745 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bce531-8ad9-43f3-9d5a-2edaf2df712f" containerName="tempest-tests-tempest-tests-runner" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.977782 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bce531-8ad9-43f3-9d5a-2edaf2df712f" containerName="tempest-tests-tempest-tests-runner" Dec 16 15:54:50 crc kubenswrapper[4728]: E1216 15:54:50.977822 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="registry-server" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.977840 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="registry-server" Dec 16 15:54:50 crc kubenswrapper[4728]: E1216 15:54:50.977883 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="extract-content" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.977900 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="extract-content" Dec 16 15:54:50 crc kubenswrapper[4728]: E1216 15:54:50.977942 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="extract-utilities" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.977954 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="extract-utilities" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.978264 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b75b38-f1a1-42ca-a68d-28eded69163f" containerName="registry-server" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.978311 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="78bce531-8ad9-43f3-9d5a-2edaf2df712f" containerName="tempest-tests-tempest-tests-runner" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.979339 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.987030 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sbfdl" Dec 16 15:54:50 crc kubenswrapper[4728]: I1216 15:54:50.994456 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.155711 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.156008 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlmh\" (UniqueName: \"kubernetes.io/projected/da534e2d-cb12-451d-b5bf-16b7943c82bb-kube-api-access-8xlmh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.257753 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlmh\" (UniqueName: \"kubernetes.io/projected/da534e2d-cb12-451d-b5bf-16b7943c82bb-kube-api-access-8xlmh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.257870 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.258791 4728 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.282003 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlmh\" (UniqueName: \"kubernetes.io/projected/da534e2d-cb12-451d-b5bf-16b7943c82bb-kube-api-access-8xlmh\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc kubenswrapper[4728]: I1216 15:54:51.310054 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da534e2d-cb12-451d-b5bf-16b7943c82bb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:51 crc 
kubenswrapper[4728]: I1216 15:54:51.606304 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 16 15:54:52 crc kubenswrapper[4728]: I1216 15:54:52.189173 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 16 15:54:52 crc kubenswrapper[4728]: I1216 15:54:52.628237 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"da534e2d-cb12-451d-b5bf-16b7943c82bb","Type":"ContainerStarted","Data":"eed56076d11fa308770dc6b20f6790f0496c193712f8141f896028058150220b"} Dec 16 15:54:54 crc kubenswrapper[4728]: I1216 15:54:54.648590 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"da534e2d-cb12-451d-b5bf-16b7943c82bb","Type":"ContainerStarted","Data":"c74c97ef88a59915b69012c3bf708b9296bd14940e38839b98a5d4496d0255fc"} Dec 16 15:54:54 crc kubenswrapper[4728]: I1216 15:54:54.664440 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.676072055 podStartE2EDuration="4.664400386s" podCreationTimestamp="2025-12-16 15:54:50 +0000 UTC" firstStartedPulling="2025-12-16 15:54:52.191942141 +0000 UTC m=+3473.032121155" lastFinishedPulling="2025-12-16 15:54:54.180270502 +0000 UTC m=+3475.020449486" observedRunningTime="2025-12-16 15:54:54.66119722 +0000 UTC m=+3475.501376194" watchObservedRunningTime="2025-12-16 15:54:54.664400386 +0000 UTC m=+3475.504579370" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.516193 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqkxc"] Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.518949 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.536807 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqkxc"] Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.695844 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m7vj\" (UniqueName: \"kubernetes.io/projected/abd363b3-5ef8-4890-886c-26180fe7cd78-kube-api-access-8m7vj\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.696331 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-utilities\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.696489 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-catalog-content\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.798289 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m7vj\" (UniqueName: \"kubernetes.io/projected/abd363b3-5ef8-4890-886c-26180fe7cd78-kube-api-access-8m7vj\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.799007 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-utilities\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.799070 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-catalog-content\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.799643 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-catalog-content\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.799756 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-utilities\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.817889 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8m7vj\" (UniqueName: \"kubernetes.io/projected/abd363b3-5ef8-4890-886c-26180fe7cd78-kube-api-access-8m7vj\") pod \"certified-operators-vqkxc\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:16 crc kubenswrapper[4728]: I1216 15:55:16.839316 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.445691 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqkxc"] Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.841453 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9rv49/must-gather-g26v5"] Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.843929 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.849565 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9rv49"/"default-dockercfg-z8m8r" Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.852443 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9rv49"/"kube-root-ca.crt" Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.852995 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9rv49"/"openshift-service-ca.crt" Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.855602 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9rv49/must-gather-g26v5"] Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.885487 4728 generic.go:334] "Generic (PLEG): container finished" podID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerID="cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f" exitCode=0 Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.885538 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerDied","Data":"cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f"} Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.885561 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerStarted","Data":"7e1d70c8c24b979c6b912b0fa20a471108b24dccde78b4f30dfc69d43e5ef7ce"} Dec 16 15:55:17 crc kubenswrapper[4728]: I1216 15:55:17.891049 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.030371 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/745dd14c-6645-41dd-a220-c45f310237d4-must-gather-output\") pod \"must-gather-g26v5\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.030634 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zl5\" (UniqueName: \"kubernetes.io/projected/745dd14c-6645-41dd-a220-c45f310237d4-kube-api-access-t7zl5\") pod \"must-gather-g26v5\" 
(UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.132658 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zl5\" (UniqueName: \"kubernetes.io/projected/745dd14c-6645-41dd-a220-c45f310237d4-kube-api-access-t7zl5\") pod \"must-gather-g26v5\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.132718 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/745dd14c-6645-41dd-a220-c45f310237d4-must-gather-output\") pod \"must-gather-g26v5\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.133198 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/745dd14c-6645-41dd-a220-c45f310237d4-must-gather-output\") pod \"must-gather-g26v5\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.165305 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zl5\" (UniqueName: \"kubernetes.io/projected/745dd14c-6645-41dd-a220-c45f310237d4-kube-api-access-t7zl5\") pod \"must-gather-g26v5\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.165686 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.601082 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9rv49/must-gather-g26v5"] Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.896628 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerStarted","Data":"06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f"} Dec 16 15:55:18 crc kubenswrapper[4728]: I1216 15:55:18.898539 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/must-gather-g26v5" event={"ID":"745dd14c-6645-41dd-a220-c45f310237d4","Type":"ContainerStarted","Data":"5644c8b079bb1c41085a8329d689dd922f7c712ef83e5d60958e04db9651c757"} Dec 16 15:55:19 crc kubenswrapper[4728]: I1216 15:55:19.912244 4728 generic.go:334] "Generic (PLEG): container finished" podID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerID="06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f" exitCode=0 Dec 16 15:55:19 crc kubenswrapper[4728]: I1216 15:55:19.912865 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerDied","Data":"06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f"} Dec 16 15:55:20 crc kubenswrapper[4728]: I1216 15:55:20.930803 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerStarted","Data":"2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45"} Dec 16 15:55:20 crc kubenswrapper[4728]: I1216 15:55:20.957589 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqkxc" podStartSLOduration=2.480253232 podStartE2EDuration="4.957569109s" podCreationTimestamp="2025-12-16 15:55:16 +0000 UTC" firstStartedPulling="2025-12-16 15:55:17.890727413 +0000 UTC m=+3498.730906397" lastFinishedPulling="2025-12-16 15:55:20.36804329 +0000 UTC m=+3501.208222274" observedRunningTime="2025-12-16 15:55:20.946615843 +0000 UTC m=+3501.786794827" watchObservedRunningTime="2025-12-16 15:55:20.957569109 +0000 UTC m=+3501.797748093" Dec 16 15:55:26 crc kubenswrapper[4728]: I1216 15:55:26.839608 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:26 crc kubenswrapper[4728]: I1216 15:55:26.840201 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:26 crc kubenswrapper[4728]: I1216 15:55:26.891520 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:27 crc kubenswrapper[4728]: I1216 15:55:27.044947 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:27 crc kubenswrapper[4728]: I1216 15:55:27.148205 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqkxc"] Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.005397 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/must-gather-g26v5" 
event={"ID":"745dd14c-6645-41dd-a220-c45f310237d4","Type":"ContainerStarted","Data":"15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b"} Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.006192 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/must-gather-g26v5" event={"ID":"745dd14c-6645-41dd-a220-c45f310237d4","Type":"ContainerStarted","Data":"ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810"} Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.005530 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqkxc" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="registry-server" containerID="cri-o://2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45" gracePeriod=2 Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.029037 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9rv49/must-gather-g26v5" podStartSLOduration=2.572577068 podStartE2EDuration="12.029018629s" podCreationTimestamp="2025-12-16 15:55:17 +0000 UTC" firstStartedPulling="2025-12-16 15:55:18.609893783 +0000 UTC m=+3499.450072767" lastFinishedPulling="2025-12-16 15:55:28.066335344 +0000 UTC m=+3508.906514328" observedRunningTime="2025-12-16 15:55:29.026767288 +0000 UTC m=+3509.866946272" watchObservedRunningTime="2025-12-16 15:55:29.029018629 +0000 UTC m=+3509.869197613" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.495827 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.663923 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-catalog-content\") pod \"abd363b3-5ef8-4890-886c-26180fe7cd78\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.664088 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m7vj\" (UniqueName: \"kubernetes.io/projected/abd363b3-5ef8-4890-886c-26180fe7cd78-kube-api-access-8m7vj\") pod \"abd363b3-5ef8-4890-886c-26180fe7cd78\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.664185 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-utilities\") pod \"abd363b3-5ef8-4890-886c-26180fe7cd78\" (UID: \"abd363b3-5ef8-4890-886c-26180fe7cd78\") " Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.665136 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-utilities" (OuterVolumeSpecName: "utilities") pod "abd363b3-5ef8-4890-886c-26180fe7cd78" (UID: "abd363b3-5ef8-4890-886c-26180fe7cd78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.669759 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd363b3-5ef8-4890-886c-26180fe7cd78-kube-api-access-8m7vj" (OuterVolumeSpecName: "kube-api-access-8m7vj") pod "abd363b3-5ef8-4890-886c-26180fe7cd78" (UID: "abd363b3-5ef8-4890-886c-26180fe7cd78"). 
InnerVolumeSpecName "kube-api-access-8m7vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.727909 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abd363b3-5ef8-4890-886c-26180fe7cd78" (UID: "abd363b3-5ef8-4890-886c-26180fe7cd78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.766254 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m7vj\" (UniqueName: \"kubernetes.io/projected/abd363b3-5ef8-4890-886c-26180fe7cd78-kube-api-access-8m7vj\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.766524 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:29 crc kubenswrapper[4728]: I1216 15:55:29.766595 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abd363b3-5ef8-4890-886c-26180fe7cd78-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.019381 4728 generic.go:334] "Generic (PLEG): container finished" podID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerID="2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45" exitCode=0 Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.019463 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqkxc" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.019561 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerDied","Data":"2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45"} Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.019668 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqkxc" event={"ID":"abd363b3-5ef8-4890-886c-26180fe7cd78","Type":"ContainerDied","Data":"7e1d70c8c24b979c6b912b0fa20a471108b24dccde78b4f30dfc69d43e5ef7ce"} Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.019701 4728 scope.go:117] "RemoveContainer" containerID="2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.055664 4728 scope.go:117] "RemoveContainer" containerID="06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.069649 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqkxc"] Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.082126 4728 scope.go:117] "RemoveContainer" containerID="cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.087129 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqkxc"] Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.122911 4728 scope.go:117] "RemoveContainer" containerID="2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45" Dec 16 15:55:30 crc 
kubenswrapper[4728]: E1216 15:55:30.123483 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45\": container with ID starting with 2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45 not found: ID does not exist" containerID="2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.123517 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45"} err="failed to get container status \"2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45\": rpc error: code = NotFound desc = could not find container \"2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45\": container with ID starting with 2f4efd30c5836dc00c17c96a50f4988246145da656d701afde3f2b540d60fb45 not found: ID does not exist" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.123539 4728 scope.go:117] "RemoveContainer" containerID="06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f" Dec 16 15:55:30 crc kubenswrapper[4728]: E1216 15:55:30.123928 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f\": container with ID starting with 06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f not found: ID does not exist" containerID="06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.123973 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f"} err="failed to get container status \"06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f\": rpc error: code = NotFound desc = could not find container \"06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f\": container with ID starting with 06dbc48400d82484dfeb24a8a38b746dec7bdbf14f8f5c492f1d3cee66f3c17f not found: ID does not exist" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.124003 4728 scope.go:117] "RemoveContainer" containerID="cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f" Dec 16 15:55:30 crc kubenswrapper[4728]: E1216 15:55:30.124391 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f\": container with ID starting with cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f not found: ID does not exist" containerID="cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f" Dec 16 15:55:30 crc kubenswrapper[4728]: I1216 15:55:30.124430 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f"} err="failed to get container status \"cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f\": rpc error: code = NotFound desc = could not find container \"cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f\": container with ID starting with cb85d32c39e4b6abf6a38fe798527850a3254f630e1912489207d84c85b2210f not found: ID does not exist" Dec 16 15:55:31 crc kubenswrapper[4728]: 
I1216 15:55:31.517468 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" path="/var/lib/kubelet/pods/abd363b3-5ef8-4890-886c-26180fe7cd78/volumes" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.077900 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9rv49/crc-debug-gfcpr"] Dec 16 15:55:32 crc kubenswrapper[4728]: E1216 15:55:32.078478 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="extract-content" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.078491 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="extract-content" Dec 16 15:55:32 crc kubenswrapper[4728]: E1216 15:55:32.078508 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="extract-utilities" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.078514 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="extract-utilities" Dec 16 15:55:32 crc kubenswrapper[4728]: E1216 15:55:32.078540 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="registry-server" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.078548 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="registry-server" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.078720 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd363b3-5ef8-4890-886c-26180fe7cd78" containerName="registry-server" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.079262 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.209916 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c8c69db-fec2-4851-80fd-43e1efcf46d2-host\") pod \"crc-debug-gfcpr\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.210044 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4h7q\" (UniqueName: \"kubernetes.io/projected/9c8c69db-fec2-4851-80fd-43e1efcf46d2-kube-api-access-p4h7q\") pod \"crc-debug-gfcpr\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.312024 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4h7q\" (UniqueName: \"kubernetes.io/projected/9c8c69db-fec2-4851-80fd-43e1efcf46d2-kube-api-access-p4h7q\") pod \"crc-debug-gfcpr\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.312221 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c8c69db-fec2-4851-80fd-43e1efcf46d2-host\") pod \"crc-debug-gfcpr\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.312391 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c8c69db-fec2-4851-80fd-43e1efcf46d2-host\") pod \"crc-debug-gfcpr\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.339603 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4h7q\" (UniqueName: \"kubernetes.io/projected/9c8c69db-fec2-4851-80fd-43e1efcf46d2-kube-api-access-p4h7q\") pod \"crc-debug-gfcpr\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: I1216 15:55:32.394609 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:55:32 crc kubenswrapper[4728]: W1216 15:55:32.428821 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c8c69db_fec2_4851_80fd_43e1efcf46d2.slice/crio-c2867f253fd4d2aada93d1bab41df638749b96393e5e1bbef3e873b103d806f5 WatchSource:0}: Error finding container c2867f253fd4d2aada93d1bab41df638749b96393e5e1bbef3e873b103d806f5: Status 404 returned error can't find the container with id c2867f253fd4d2aada93d1bab41df638749b96393e5e1bbef3e873b103d806f5 Dec 16 15:55:33 crc kubenswrapper[4728]: I1216 15:55:33.051670 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" event={"ID":"9c8c69db-fec2-4851-80fd-43e1efcf46d2","Type":"ContainerStarted","Data":"c2867f253fd4d2aada93d1bab41df638749b96393e5e1bbef3e873b103d806f5"} Dec 16 15:55:44 crc kubenswrapper[4728]: I1216 15:55:44.153084 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" event={"ID":"9c8c69db-fec2-4851-80fd-43e1efcf46d2","Type":"ContainerStarted","Data":"856234ebb91f2c39c79290797a29772648b2ec5369579d5ffde06566c367b021"} Dec 16 15:55:44 crc kubenswrapper[4728]: I1216 15:55:44.183998 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" podStartSLOduration=1.533698372 podStartE2EDuration="12.183972559s" podCreationTimestamp="2025-12-16 15:55:32 +0000 UTC" firstStartedPulling="2025-12-16 15:55:32.431398716 +0000 UTC m=+3513.271577690" lastFinishedPulling="2025-12-16 15:55:43.081672893 +0000 UTC m=+3523.921851877" observedRunningTime="2025-12-16 15:55:44.179677343 +0000 UTC m=+3525.019856327" watchObservedRunningTime="2025-12-16 15:55:44.183972559 +0000 UTC m=+3525.024151553" Dec 16 15:56:22 crc kubenswrapper[4728]: I1216 15:56:22.494280 4728 generic.go:334] "Generic (PLEG): container finished" podID="9c8c69db-fec2-4851-80fd-43e1efcf46d2" containerID="856234ebb91f2c39c79290797a29772648b2ec5369579d5ffde06566c367b021" exitCode=0 Dec 16 15:56:22 crc kubenswrapper[4728]: I1216 15:56:22.494360 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" event={"ID":"9c8c69db-fec2-4851-80fd-43e1efcf46d2","Type":"ContainerDied","Data":"856234ebb91f2c39c79290797a29772648b2ec5369579d5ffde06566c367b021"} Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.630006 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.691904 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9rv49/crc-debug-gfcpr"] Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.705491 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9rv49/crc-debug-gfcpr"] Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.733293 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4h7q\" (UniqueName: \"kubernetes.io/projected/9c8c69db-fec2-4851-80fd-43e1efcf46d2-kube-api-access-p4h7q\") pod \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.733585 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c8c69db-fec2-4851-80fd-43e1efcf46d2-host\") pod \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\" (UID: \"9c8c69db-fec2-4851-80fd-43e1efcf46d2\") " Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.733809 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c8c69db-fec2-4851-80fd-43e1efcf46d2-host" (OuterVolumeSpecName: "host") pod "9c8c69db-fec2-4851-80fd-43e1efcf46d2" (UID: "9c8c69db-fec2-4851-80fd-43e1efcf46d2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.734203 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c8c69db-fec2-4851-80fd-43e1efcf46d2-host\") on node \"crc\" DevicePath \"\"" Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.745641 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c8c69db-fec2-4851-80fd-43e1efcf46d2-kube-api-access-p4h7q" (OuterVolumeSpecName: "kube-api-access-p4h7q") pod "9c8c69db-fec2-4851-80fd-43e1efcf46d2" (UID: "9c8c69db-fec2-4851-80fd-43e1efcf46d2"). InnerVolumeSpecName "kube-api-access-p4h7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:56:23 crc kubenswrapper[4728]: I1216 15:56:23.836002 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4h7q\" (UniqueName: \"kubernetes.io/projected/9c8c69db-fec2-4851-80fd-43e1efcf46d2-kube-api-access-p4h7q\") on node \"crc\" DevicePath \"\"" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.523021 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2867f253fd4d2aada93d1bab41df638749b96393e5e1bbef3e873b103d806f5" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.523206 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-gfcpr" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.832637 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9rv49/crc-debug-zrxlw"] Dec 16 15:56:24 crc kubenswrapper[4728]: E1216 15:56:24.833056 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c8c69db-fec2-4851-80fd-43e1efcf46d2" containerName="container-00" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.833071 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c8c69db-fec2-4851-80fd-43e1efcf46d2" containerName="container-00" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.833302 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c8c69db-fec2-4851-80fd-43e1efcf46d2" containerName="container-00" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.833989 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.960601 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmk66\" (UniqueName: \"kubernetes.io/projected/ec150f37-8538-42e8-9e8e-a17d33a0284a-kube-api-access-wmk66\") pod \"crc-debug-zrxlw\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:24 crc kubenswrapper[4728]: I1216 15:56:24.960742 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec150f37-8538-42e8-9e8e-a17d33a0284a-host\") pod \"crc-debug-zrxlw\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.062610 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmk66\" (UniqueName: \"kubernetes.io/projected/ec150f37-8538-42e8-9e8e-a17d33a0284a-kube-api-access-wmk66\") pod \"crc-debug-zrxlw\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.062798 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec150f37-8538-42e8-9e8e-a17d33a0284a-host\") pod \"crc-debug-zrxlw\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.062975 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec150f37-8538-42e8-9e8e-a17d33a0284a-host\") pod \"crc-debug-zrxlw\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.098078 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmk66\" (UniqueName: \"kubernetes.io/projected/ec150f37-8538-42e8-9e8e-a17d33a0284a-kube-api-access-wmk66\") pod \"crc-debug-zrxlw\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.153351 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.522634 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c8c69db-fec2-4851-80fd-43e1efcf46d2" path="/var/lib/kubelet/pods/9c8c69db-fec2-4851-80fd-43e1efcf46d2/volumes" Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.537443 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" event={"ID":"ec150f37-8538-42e8-9e8e-a17d33a0284a","Type":"ContainerStarted","Data":"c0a5d4926faa61d63e8b2631d3ca0e1af22d01ce26fca2c8e11c5fc68d2ea169"} Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.537492 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" event={"ID":"ec150f37-8538-42e8-9e8e-a17d33a0284a","Type":"ContainerStarted","Data":"527eefb2af4df07e0b95df226676edc6b5ea567c2006cc1e655e5c4c09f43aad"} Dec 16 15:56:25 crc kubenswrapper[4728]: I1216 15:56:25.565932 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" podStartSLOduration=1.565912553 podStartE2EDuration="1.565912553s" podCreationTimestamp="2025-12-16 15:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 15:56:25.551268377 +0000 UTC m=+3566.391447361" watchObservedRunningTime="2025-12-16 15:56:25.565912553 +0000 UTC m=+3566.406091557" Dec 16 15:56:26 crc kubenswrapper[4728]: I1216 15:56:26.546376 4728 generic.go:334] "Generic (PLEG): container finished" podID="ec150f37-8538-42e8-9e8e-a17d33a0284a" containerID="c0a5d4926faa61d63e8b2631d3ca0e1af22d01ce26fca2c8e11c5fc68d2ea169" exitCode=0 Dec 16 15:56:26 crc kubenswrapper[4728]: I1216 15:56:26.546486 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" event={"ID":"ec150f37-8538-42e8-9e8e-a17d33a0284a","Type":"ContainerDied","Data":"c0a5d4926faa61d63e8b2631d3ca0e1af22d01ce26fca2c8e11c5fc68d2ea169"} Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.698419 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.738788 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9rv49/crc-debug-zrxlw"] Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.746784 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9rv49/crc-debug-zrxlw"] Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.827711 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec150f37-8538-42e8-9e8e-a17d33a0284a-host\") pod \"ec150f37-8538-42e8-9e8e-a17d33a0284a\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.827890 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec150f37-8538-42e8-9e8e-a17d33a0284a-host" (OuterVolumeSpecName: "host") pod "ec150f37-8538-42e8-9e8e-a17d33a0284a" (UID: "ec150f37-8538-42e8-9e8e-a17d33a0284a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.827988 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmk66\" (UniqueName: \"kubernetes.io/projected/ec150f37-8538-42e8-9e8e-a17d33a0284a-kube-api-access-wmk66\") pod \"ec150f37-8538-42e8-9e8e-a17d33a0284a\" (UID: \"ec150f37-8538-42e8-9e8e-a17d33a0284a\") " Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.828773 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec150f37-8538-42e8-9e8e-a17d33a0284a-host\") on node \"crc\" DevicePath \"\"" Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.832812 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec150f37-8538-42e8-9e8e-a17d33a0284a-kube-api-access-wmk66" (OuterVolumeSpecName: "kube-api-access-wmk66") pod "ec150f37-8538-42e8-9e8e-a17d33a0284a" (UID: "ec150f37-8538-42e8-9e8e-a17d33a0284a"). InnerVolumeSpecName "kube-api-access-wmk66". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:56:27 crc kubenswrapper[4728]: I1216 15:56:27.931101 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmk66\" (UniqueName: \"kubernetes.io/projected/ec150f37-8538-42e8-9e8e-a17d33a0284a-kube-api-access-wmk66\") on node \"crc\" DevicePath \"\"" Dec 16 15:56:28 crc kubenswrapper[4728]: I1216 15:56:28.573193 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527eefb2af4df07e0b95df226676edc6b5ea567c2006cc1e655e5c4c09f43aad" Dec 16 15:56:28 crc kubenswrapper[4728]: I1216 15:56:28.573493 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-zrxlw" Dec 16 15:56:28 crc kubenswrapper[4728]: I1216 15:56:28.971373 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9rv49/crc-debug-kzg77"] Dec 16 15:56:28 crc kubenswrapper[4728]: E1216 15:56:28.971801 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec150f37-8538-42e8-9e8e-a17d33a0284a" containerName="container-00" Dec 16 15:56:28 crc kubenswrapper[4728]: I1216 15:56:28.971817 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec150f37-8538-42e8-9e8e-a17d33a0284a" containerName="container-00" Dec 16 15:56:28 crc kubenswrapper[4728]: I1216 15:56:28.971997 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec150f37-8538-42e8-9e8e-a17d33a0284a" containerName="container-00" Dec 16 15:56:28 crc kubenswrapper[4728]: I1216 15:56:28.973064 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.157221 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16804427-3a97-4b04-9ff1-132371e8c396-host\") pod \"crc-debug-kzg77\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.157275 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5rmt\" (UniqueName: \"kubernetes.io/projected/16804427-3a97-4b04-9ff1-132371e8c396-kube-api-access-r5rmt\") pod \"crc-debug-kzg77\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.259021 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16804427-3a97-4b04-9ff1-132371e8c396-host\") pod \"crc-debug-kzg77\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.259100 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5rmt\" (UniqueName: \"kubernetes.io/projected/16804427-3a97-4b04-9ff1-132371e8c396-kube-api-access-r5rmt\") pod \"crc-debug-kzg77\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.259114 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16804427-3a97-4b04-9ff1-132371e8c396-host\") pod \"crc-debug-kzg77\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.278629 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5rmt\" (UniqueName: \"kubernetes.io/projected/16804427-3a97-4b04-9ff1-132371e8c396-kube-api-access-r5rmt\") pod \"crc-debug-kzg77\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.293508 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:29 crc kubenswrapper[4728]: W1216 15:56:29.333764 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16804427_3a97_4b04_9ff1_132371e8c396.slice/crio-907f9771f2a04bd5d7000e0ee463adb25f832ac880e9a80a566fdce23cf479f9 WatchSource:0}: Error finding container 907f9771f2a04bd5d7000e0ee463adb25f832ac880e9a80a566fdce23cf479f9: Status 404 returned error can't find the container with id 907f9771f2a04bd5d7000e0ee463adb25f832ac880e9a80a566fdce23cf479f9 Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.518098 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec150f37-8538-42e8-9e8e-a17d33a0284a" path="/var/lib/kubelet/pods/ec150f37-8538-42e8-9e8e-a17d33a0284a/volumes" Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.584088 4728 generic.go:334] "Generic (PLEG): container finished" podID="16804427-3a97-4b04-9ff1-132371e8c396" containerID="3616763d562f6fcce593083bc31ebee25e8e3abda697c2beb6995718c689299d" exitCode=0 Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.584129 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-kzg77" event={"ID":"16804427-3a97-4b04-9ff1-132371e8c396","Type":"ContainerDied","Data":"3616763d562f6fcce593083bc31ebee25e8e3abda697c2beb6995718c689299d"} Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.584168 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/crc-debug-kzg77" event={"ID":"16804427-3a97-4b04-9ff1-132371e8c396","Type":"ContainerStarted","Data":"907f9771f2a04bd5d7000e0ee463adb25f832ac880e9a80a566fdce23cf479f9"} Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.638255 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9rv49/crc-debug-kzg77"] Dec 16 15:56:29 crc kubenswrapper[4728]: I1216 15:56:29.648056 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9rv49/crc-debug-kzg77"] Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.724616 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.886688 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5rmt\" (UniqueName: \"kubernetes.io/projected/16804427-3a97-4b04-9ff1-132371e8c396-kube-api-access-r5rmt\") pod \"16804427-3a97-4b04-9ff1-132371e8c396\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.887141 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16804427-3a97-4b04-9ff1-132371e8c396-host\") pod \"16804427-3a97-4b04-9ff1-132371e8c396\" (UID: \"16804427-3a97-4b04-9ff1-132371e8c396\") " Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.887232 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16804427-3a97-4b04-9ff1-132371e8c396-host" (OuterVolumeSpecName: "host") pod "16804427-3a97-4b04-9ff1-132371e8c396" (UID: "16804427-3a97-4b04-9ff1-132371e8c396"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.887576 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/16804427-3a97-4b04-9ff1-132371e8c396-host\") on node \"crc\" DevicePath \"\"" Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.892833 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16804427-3a97-4b04-9ff1-132371e8c396-kube-api-access-r5rmt" (OuterVolumeSpecName: "kube-api-access-r5rmt") pod "16804427-3a97-4b04-9ff1-132371e8c396" (UID: "16804427-3a97-4b04-9ff1-132371e8c396"). InnerVolumeSpecName "kube-api-access-r5rmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 15:56:30 crc kubenswrapper[4728]: I1216 15:56:30.989186 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5rmt\" (UniqueName: \"kubernetes.io/projected/16804427-3a97-4b04-9ff1-132371e8c396-kube-api-access-r5rmt\") on node \"crc\" DevicePath \"\"" Dec 16 15:56:31 crc kubenswrapper[4728]: I1216 15:56:31.518704 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16804427-3a97-4b04-9ff1-132371e8c396" path="/var/lib/kubelet/pods/16804427-3a97-4b04-9ff1-132371e8c396/volumes" Dec 16 15:56:31 crc kubenswrapper[4728]: I1216 15:56:31.605139 4728 scope.go:117] "RemoveContainer" containerID="3616763d562f6fcce593083bc31ebee25e8e3abda697c2beb6995718c689299d" Dec 16 15:56:31 crc kubenswrapper[4728]: I1216 15:56:31.605317 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/crc-debug-kzg77" Dec 16 15:56:38 crc kubenswrapper[4728]: I1216 15:56:38.818795 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:56:38 crc kubenswrapper[4728]: I1216 15:56:38.820551 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:56:45 crc kubenswrapper[4728]: I1216 15:56:45.380807 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76fcd78578-bhff6_4589b3db-cca9-45d9-a576-71188fd26cd1/barbican-api/0.log" Dec 16 15:56:45 crc kubenswrapper[4728]: I1216 15:56:45.565126 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8957f9486-cds65_f3dd302c-4cb1-487b-9995-a99059ee9ac6/barbican-keystone-listener/0.log" Dec 16 15:56:45 crc kubenswrapper[4728]: I1216 15:56:45.593793 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76fcd78578-bhff6_4589b3db-cca9-45d9-a576-71188fd26cd1/barbican-api-log/0.log" Dec 16 15:56:45 crc kubenswrapper[4728]: I1216 15:56:45.654536 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8957f9486-cds65_f3dd302c-4cb1-487b-9995-a99059ee9ac6/barbican-keystone-listener-log/0.log" Dec 16 15:56:45 crc kubenswrapper[4728]: I1216 15:56:45.780029 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-86cff44659-k2jp2_e3e0ec72-0e84-444e-a66f-50b4fe91adb5/barbican-worker/0.log" Dec 16 15:56:45 crc kubenswrapper[4728]: I1216 15:56:45.845399 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86cff44659-k2jp2_e3e0ec72-0e84-444e-a66f-50b4fe91adb5/barbican-worker-log/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.030421 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr_801eb0fd-312d-4913-8608-52baf1c65fea/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.099019 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/ceilometer-central-agent/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.153345 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/ceilometer-notification-agent/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.252426 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/sg-core/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.256500 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/proxy-httpd/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.441751 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c3da261d-5106-45a2-a6c7-d5314450c0af/cinder-api/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.487515 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c3da261d-5106-45a2-a6c7-d5314450c0af/cinder-api-log/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.594214 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a95d0c5b-fcce-46ba-bfae-1b25bf1d10af/cinder-scheduler/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.636338 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a95d0c5b-fcce-46ba-bfae-1b25bf1d10af/probe/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.780157 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5_81acd27c-46ac-4132-9e15-6858289dbb7b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.901381 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2_342edac6-5fe9-45d6-9d37-2bc1ed959d23/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:46 crc kubenswrapper[4728]: I1216 15:56:46.983183 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-xsxxz_e4e2028c-f46c-4fd1-8dee-4fb4860de081/init/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.207951 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-xsxxz_e4e2028c-f46c-4fd1-8dee-4fb4860de081/init/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.214048 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-xsxxz_e4e2028c-f46c-4fd1-8dee-4fb4860de081/dnsmasq-dns/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.234895 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jwww2_26b6262a-41a3-48c4-aba9-a54801be0a7c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.518026 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_453173c9-63a1-457e-bf01-dd45f194a238/glance-log/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.565504 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_453173c9-63a1-457e-bf01-dd45f194a238/glance-httpd/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.722016 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_db21c1bc-6a08-4948-8cea-5d5ee3ecd223/glance-httpd/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.808017 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_db21c1bc-6a08-4948-8cea-5d5ee3ecd223/glance-log/0.log" Dec 16 15:56:47 crc kubenswrapper[4728]: I1216 15:56:47.890871 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7585b44dcb-46w99_ac195fba-37cf-48a1-aa91-c9df824ddfe4/horizon/0.log" Dec 16 15:56:48 crc kubenswrapper[4728]: I1216 15:56:48.072752 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5_7238debe-2d46-40ca-b598-2011d69c375c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:48 crc kubenswrapper[4728]: I1216 15:56:48.316763 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7585b44dcb-46w99_ac195fba-37cf-48a1-aa91-c9df824ddfe4/horizon-log/0.log" Dec 16 15:56:48 crc kubenswrapper[4728]: I1216 15:56:48.333072 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gsn6x_6f876743-6860-4e07-b8ed-d1cfcd92f2a7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:48 crc kubenswrapper[4728]: I1216 15:56:48.531822 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5874cbd465-jjmn6_18996006-74fc-4090-941f-783741605f54/keystone-api/0.log" Dec 16 15:56:48 crc kubenswrapper[4728]: I1216 15:56:48.542053 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_dca6bff3-10b0-4969-b7ef-f31cee80091d/kube-state-metrics/0.log" Dec 16 15:56:48 crc kubenswrapper[4728]: I1216 15:56:48.828992 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fg5md_355982cb-601d-4505-926c-8fa80bd4f3b6/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:49 crc kubenswrapper[4728]: I1216 15:56:49.073226 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79c9d99cd5-967vg_fcc16f45-2441-47bf-a452-25f78e044a7e/neutron-api/0.log" Dec 16 15:56:49 crc kubenswrapper[4728]: I1216 15:56:49.235338 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79c9d99cd5-967vg_fcc16f45-2441-47bf-a452-25f78e044a7e/neutron-httpd/0.log" Dec 16 15:56:49 crc kubenswrapper[4728]: I1216 15:56:49.326246 4728 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt_545f4f42-f672-4cd9-8050-296aa0dd57b8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:49 crc kubenswrapper[4728]: I1216 15:56:49.758030 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_06dceccb-f462-4eec-b6eb-e7b626c54b66/nova-cell0-conductor-conductor/0.log" Dec 16 15:56:49 crc kubenswrapper[4728]: I1216 15:56:49.882594 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c311506-90af-4f99-867d-aa1f1b5d2d74/nova-api-log/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.056648 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c311506-90af-4f99-867d-aa1f1b5d2d74/nova-api-api/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.093253 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_170e3d88-1e9a-4e6b-aead-ced16b98610e/nova-cell1-conductor-conductor/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.219857 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc/nova-cell1-novncproxy-novncproxy/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.360880 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-77qmn_655f0b26-df18-45a2-a9f9-24df853d48ed/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.488193 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f1f3cd14-f2b0-4fde-a31e-e686b154eb77/nova-metadata-log/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.784063 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ba3dbe8b-ebcb-47f8-8f81-924aec84c326/nova-scheduler-scheduler/0.log" Dec 16 15:56:50 crc kubenswrapper[4728]: I1216 15:56:50.785961 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76f2644a-8bb9-4719-83dd-429202a52446/mysql-bootstrap/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.029515 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76f2644a-8bb9-4719-83dd-429202a52446/galera/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.047677 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76f2644a-8bb9-4719-83dd-429202a52446/mysql-bootstrap/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.253956 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eb629e93-c552-47c3-8c89-11254ffa834f/mysql-bootstrap/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.422947 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eb629e93-c552-47c3-8c89-11254ffa834f/mysql-bootstrap/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.528664 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eb629e93-c552-47c3-8c89-11254ffa834f/galera/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.625176 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_09f99482-afc8-48dd-95a3-ada07d611db1/openstackclient/0.log" Dec 16 15:56:51 crc 
kubenswrapper[4728]: I1216 15:56:51.726377 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f1f3cd14-f2b0-4fde-a31e-e686b154eb77/nova-metadata-metadata/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.753095 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hlkkv_37c82b8b-fe2d-4265-80b1-7cdfa00e2be7/ovn-controller/0.log" Dec 16 15:56:51 crc kubenswrapper[4728]: I1216 15:56:51.994706 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ccc6t_effa7d99-cccc-431b-91b6-d4302f7dce22/openstack-network-exporter/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.011730 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovsdb-server-init/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.186514 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovs-vswitchd/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.197373 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovsdb-server/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.239477 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovsdb-server-init/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.431661 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xxw58_2767eeb4-bf6d-4381-8277-c6d99cad99a5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.468803 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d390ca4f-5aa2-45e8-a08e-b2e86218e36f/openstack-network-exporter/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.546200 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d390ca4f-5aa2-45e8-a08e-b2e86218e36f/ovn-northd/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.675806 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b2809df7-1873-474c-ab44-14b82f630cb0/openstack-network-exporter/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.680733 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b2809df7-1873-474c-ab44-14b82f630cb0/ovsdbserver-nb/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.823944 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d587bd5e-c0c9-48f1-a2b6-616e904ceed3/openstack-network-exporter/0.log" Dec 16 15:56:52 crc kubenswrapper[4728]: I1216 15:56:52.881014 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d587bd5e-c0c9-48f1-a2b6-616e904ceed3/ovsdbserver-sb/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.063937 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7dcd7544cd-gnxgg_7ac43e45-8d37-4ab4-9ebe-441421fe9044/placement-api/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.113793 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e64ff4ca-1141-477e-8db1-b2068e3b6d9a/setup-container/0.log" 
Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.138342 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7dcd7544cd-gnxgg_7ac43e45-8d37-4ab4-9ebe-441421fe9044/placement-log/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.353918 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e64ff4ca-1141-477e-8db1-b2068e3b6d9a/setup-container/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.428836 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e19aee19-231d-4847-9e7e-78b8745576ae/setup-container/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.446326 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e64ff4ca-1141-477e-8db1-b2068e3b6d9a/rabbitmq/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.601887 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e19aee19-231d-4847-9e7e-78b8745576ae/setup-container/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.659228 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e19aee19-231d-4847-9e7e-78b8745576ae/rabbitmq/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.701857 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79_8154d34c-28e4-4d89-a271-f1b2fb4daa29/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.902420 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fbn4q_447d1f35-7fe1-4655-8893-3ca4afed13d6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:53 crc kubenswrapper[4728]: I1216 15:56:53.918976 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx_37e0ae2a-b0ba-45a7-9395-1af1365adf86/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.080244 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qpfkk_6270b5fc-f711-41e7-b66c-1cac1f2f3b43/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.145367 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9gcfz_a9b27bc6-f730-4cc8-a626-de82d2c022b8/ssh-known-hosts-edpm-deployment/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.406730 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54bb7475-hxsvl_b5b59721-592a-4649-8246-0487a18177b9/proxy-server/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.507587 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54bb7475-hxsvl_b5b59721-592a-4649-8246-0487a18177b9/proxy-httpd/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.628784 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vvql8_55ebd6bb-cac2-4b8f-932d-46662c011b18/swift-ring-rebalance/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.671887 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-auditor/0.log" Dec 16 15:56:54 crc 
kubenswrapper[4728]: I1216 15:56:54.778303 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-reaper/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.880503 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-server/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.889098 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-replicator/0.log" Dec 16 15:56:54 crc kubenswrapper[4728]: I1216 15:56:54.942898 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-auditor/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.043666 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-replicator/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.116334 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-server/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.121940 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-updater/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.189850 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-auditor/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.251632 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-expirer/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.350514 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-server/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.365929 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-replicator/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.412222 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-updater/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.486163 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/rsync/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.532485 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/swift-recon-cron/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.707529 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt_e3adc58c-a09d-4e32-bd59-10d32f1866ca/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.822885 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_78bce531-8ad9-43f3-9d5a-2edaf2df712f/tempest-tests-tempest-tests-runner/0.log" Dec 16 15:56:55 crc kubenswrapper[4728]: I1216 15:56:55.915944 4728 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_da534e2d-cb12-451d-b5bf-16b7943c82bb/test-operator-logs-container/0.log" Dec 16 15:56:56 crc kubenswrapper[4728]: I1216 15:56:56.078951 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns_b642fac2-b01e-4ec8-80dc-3193414e335c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 15:57:04 crc kubenswrapper[4728]: I1216 15:57:04.625370 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8cf2b12c-4959-429e-b9db-173f5ddfab90/memcached/0.log" Dec 16 15:57:08 crc kubenswrapper[4728]: I1216 15:57:08.818423 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:57:08 crc kubenswrapper[4728]: I1216 15:57:08.819018 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.032976 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-l6vt8_12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c/manager/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.255746 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-qtz4v_84531a1b-f019-449d-8779-05b03bde07cb/manager/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.257979 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-6sdq7_dbf95255-3fe3-4421-be60-212514fef21c/manager/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.380669 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/util/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.533692 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/util/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.550050 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/pull/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.550123 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/pull/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.739691 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/pull/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 
15:57:20.744161 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/util/0.log" Dec 16 15:57:20 crc kubenswrapper[4728]: I1216 15:57:20.794329 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/extract/0.log" Dec 16 15:57:21 crc kubenswrapper[4728]: I1216 15:57:21.314554 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-qpsk9_90e228b6-e35d-4ee2-992c-364b4abd8436/manager/0.log" Dec 16 15:57:21 crc kubenswrapper[4728]: I1216 15:57:21.608443 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-hcdxf_0252b186-dc46-4cca-ba92-9855cb2aa4ec/manager/0.log" Dec 16 15:57:21 crc kubenswrapper[4728]: I1216 15:57:21.687560 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-wn6qf_f5364dc6-650d-427d-aab6-c50ba3d69b75/manager/0.log" Dec 16 15:57:21 crc kubenswrapper[4728]: I1216 15:57:21.775624 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-ttkv5_660d7a4f-e56a-42c8-8db6-d1f7285d7d04/manager/0.log" Dec 16 15:57:21 crc kubenswrapper[4728]: I1216 15:57:21.860012 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-ljkxp_a8ceccb7-c74c-42c4-a763-d947892f942d/manager/0.log" Dec 16 15:57:21 crc kubenswrapper[4728]: I1216 15:57:21.980239 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-mns5x_0cc3d254-9633-4e63-91a8-719af70696f6/manager/0.log" Dec 16 15:57:22 crc kubenswrapper[4728]: I1216 15:57:22.024693 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-mfc2h_59a84980-fdf4-4ff3-b8c7-464e1423bad3/manager/0.log" Dec 16 15:57:22 crc kubenswrapper[4728]: I1216 15:57:22.258829 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-t6vdg_89d4ec07-baef-4061-b6d8-e50f3ab47bb1/manager/0.log" Dec 16 15:57:22 crc kubenswrapper[4728]: I1216 15:57:22.338676 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-xmv9j_fe17017f-5157-4d72-bb40-58a456517c3e/manager/0.log" Dec 16 15:57:22 crc kubenswrapper[4728]: I1216 15:57:22.458649 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-4jbw4_6b5beb20-1139-4774-8ea6-b5c951a6cbba/manager/0.log" Dec 16 15:57:22 crc kubenswrapper[4728]: I1216 15:57:22.508377 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-j7jxc_e501f8ed-3791-4661-8c3e-bfb4eaeeb64d/manager/0.log" Dec 16 15:57:22 crc kubenswrapper[4728]: I1216 15:57:22.679516 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl_a4b04d21-7de1-4565-99e6-fbeb59a0fde6/manager/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 
15:57:23.100837 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9hjx2_3818a60e-feb9-4ae0-a15a-48c59870b921/registry-server/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 15:57:23.143533 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bdf96f7b8-fqbkd_d6a45f52-4776-491e-a850-afe8d2efa914/operator/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 15:57:23.445765 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-68vvq_7a8c4b97-2de8-4235-aa76-c8382c5c5cb1/manager/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 15:57:23.466666 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-zcz8p_75c9a0f4-94bc-4bf5-b164-149256d1a214/manager/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 15:57:23.672951 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-f9pgp_160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e/operator/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 15:57:23.849150 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-757cf4457b-v8kt9_0def48bf-646d-4641-93b5-a9e4e058cc67/manager/0.log" Dec 16 15:57:23 crc kubenswrapper[4728]: I1216 15:57:23.901661 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-xvvjw_9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4/manager/0.log" Dec 16 15:57:24 crc kubenswrapper[4728]: I1216 15:57:24.110555 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-9p2mz_8324ae5e-23f8-4267-9822-a4ae37c7cd5a/manager/0.log" Dec 16 15:57:24 crc kubenswrapper[4728]: I1216 15:57:24.172118 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-dz6x8_66c25f6d-85c4-4e3e-bf44-93499cc2321c/manager/0.log" Dec 16 15:57:24 crc kubenswrapper[4728]: I1216 15:57:24.322554 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-n6x46_f155db6c-255a-4401-884a-b48825bb93c7/manager/0.log" Dec 16 15:57:38 crc kubenswrapper[4728]: I1216 15:57:38.818430 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 15:57:38 crc kubenswrapper[4728]: I1216 15:57:38.818999 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 15:57:38 crc kubenswrapper[4728]: I1216 15:57:38.819061 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 15:57:38 crc kubenswrapper[4728]: I1216 15:57:38.819732 4728 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6057ec8bf018ddc636f1fdb8a788ff08f7c76a9cc99e3eea7068955f5d92eec4"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 15:57:38 crc kubenswrapper[4728]: I1216 15:57:38.819846 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://6057ec8bf018ddc636f1fdb8a788ff08f7c76a9cc99e3eea7068955f5d92eec4" gracePeriod=600 Dec 16 15:57:39 crc kubenswrapper[4728]: I1216 15:57:39.191048 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="6057ec8bf018ddc636f1fdb8a788ff08f7c76a9cc99e3eea7068955f5d92eec4" exitCode=0 Dec 16 15:57:39 crc kubenswrapper[4728]: I1216 15:57:39.191123 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"6057ec8bf018ddc636f1fdb8a788ff08f7c76a9cc99e3eea7068955f5d92eec4"} Dec 16 15:57:39 crc kubenswrapper[4728]: I1216 15:57:39.191473 4728 scope.go:117] "RemoveContainer" containerID="042dddb153dee23bfea915081c4f74f19b859e4587293c0bdf6fa0f2ac7c2c55" Dec 16 15:57:41 crc kubenswrapper[4728]: I1216 15:57:41.212566 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d"} Dec 16 15:57:43 crc kubenswrapper[4728]: I1216 15:57:43.964023 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zf5xv_14a40ff4-9558-428f-a784-c18c5d62d60a/control-plane-machine-set-operator/0.log" Dec 16 15:57:44 crc kubenswrapper[4728]: I1216 15:57:44.115653 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pfz7w_6ef09dcb-9a41-4fb0-8492-cdd81b0222fe/kube-rbac-proxy/0.log" Dec 16 15:57:44 crc kubenswrapper[4728]: I1216 15:57:44.136795 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pfz7w_6ef09dcb-9a41-4fb0-8492-cdd81b0222fe/machine-api-operator/0.log" Dec 16 15:57:57 crc kubenswrapper[4728]: I1216 15:57:57.488287 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xlhf9_98db182b-146e-48eb-918d-ff62909f62de/cert-manager-controller/0.log" Dec 16 15:57:57 crc kubenswrapper[4728]: I1216 15:57:57.710694 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9x457_14b59c49-2ca7-4fd1-96a7-926474663fc8/cert-manager-webhook/0.log" Dec 16 15:57:57 crc kubenswrapper[4728]: I1216 15:57:57.738264 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qtpbj_86fdf4d9-bff1-40f5-b1f7-7d74536c7f39/cert-manager-cainjector/0.log" Dec 16 15:58:09 crc kubenswrapper[4728]: I1216 15:58:09.823833 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-zzbs8_84209333-74b0-4804-ac6e-e829f0ec1bc7/nmstate-console-plugin/0.log" Dec 16 15:58:10 crc kubenswrapper[4728]: I1216 15:58:10.005082 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-fkppc_22a101d0-c77f-42c4-88e7-ff7bfb0c204d/kube-rbac-proxy/0.log" Dec 16 15:58:10 crc kubenswrapper[4728]: I1216 15:58:10.008245 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hd8rz_d61cf9e1-67c0-4258-af87-e4244df3c68e/nmstate-handler/0.log" Dec 16 15:58:10 crc kubenswrapper[4728]: I1216 15:58:10.060819 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-fkppc_22a101d0-c77f-42c4-88e7-ff7bfb0c204d/nmstate-metrics/0.log" Dec 16 15:58:10 crc kubenswrapper[4728]: I1216 15:58:10.242991 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-9mppg_fdf13fea-12cb-4713-bae2-3cabd3aae756/nmstate-operator/0.log" Dec 16 15:58:10 crc kubenswrapper[4728]: I1216 15:58:10.249877 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-r6qf9_f35491e6-33aa-4c1d-a9c0-1b95f43ad54f/nmstate-webhook/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.297464 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-x5g2t_bd55b5d2-c827-4b76-bd1e-e1c033737650/kube-rbac-proxy/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.406689 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-x5g2t_bd55b5d2-c827-4b76-bd1e-e1c033737650/controller/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.497836 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.645117 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.691684 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.701301 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.709447 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.857918 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.865819 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.866446 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 15:58:24 crc kubenswrapper[4728]: I1216 15:58:24.922147 4728 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.087971 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.102793 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.103926 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.145899 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/controller/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.305700 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/frr-metrics/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.349334 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/kube-rbac-proxy-frr/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.349560 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/kube-rbac-proxy/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.554972 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/reloader/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.556030 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-5w4gq_afa56798-790e-42c2-98af-9e0f7313603c/frr-k8s-webhook-server/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.807471 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fd5945654-clj75_d1b1e578-b0a6-446b-90d1-7df5d4d4a43a/manager/0.log" Dec 16 15:58:25 crc kubenswrapper[4728]: I1216 15:58:25.957662 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6dfbdf4c69-n5ksx_55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9/webhook-server/0.log" Dec 16 15:58:26 crc kubenswrapper[4728]: I1216 15:58:26.089724 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-872z5_0c9a8885-9664-4048-bce4-8fc1cab033d8/kube-rbac-proxy/0.log" Dec 16 15:58:26 crc kubenswrapper[4728]: I1216 15:58:26.607749 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-872z5_0c9a8885-9664-4048-bce4-8fc1cab033d8/speaker/0.log" Dec 16 15:58:26 crc kubenswrapper[4728]: I1216 15:58:26.729986 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/frr/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.439796 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/util/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.665972 4728 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/pull/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.667023 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/util/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.693838 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/pull/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.817501 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/util/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.837433 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/pull/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.897764 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/extract/0.log" Dec 16 15:58:39 crc kubenswrapper[4728]: I1216 15:58:39.986751 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/util/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.176210 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/pull/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.188077 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/util/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.200977 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/pull/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.385058 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/extract/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.400887 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/util/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.401749 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/pull/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.565432 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-utilities/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.753391 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-utilities/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.763674 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-content/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.767659 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-content/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.891664 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-utilities/0.log" Dec 16 15:58:40 crc kubenswrapper[4728]: I1216 15:58:40.927731 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-content/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.092263 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-utilities/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.377494 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-utilities/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.411296 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-content/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.472198 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-content/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.581365 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/registry-server/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.649890 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-utilities/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.733861 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-content/0.log" Dec 16 15:58:41 crc kubenswrapper[4728]: I1216 15:58:41.988624 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dhkkk_28557b66-a02a-4c9e-880f-3d9f21e5892b/marketplace-operator/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.056907 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/registry-server/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.085850 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-utilities/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.207795 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-utilities/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.225542 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-content/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.254137 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-content/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.416286 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-utilities/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.462119 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-content/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.585891 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/registry-server/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.647666 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-utilities/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.804710 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-content/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.804853 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-utilities/0.log" Dec 16 15:58:42 crc kubenswrapper[4728]: I1216 15:58:42.825026 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-content/0.log" Dec 16 15:58:43 crc kubenswrapper[4728]: I1216 15:58:43.021979 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-content/0.log" Dec 16 15:58:43 crc kubenswrapper[4728]: I1216 15:58:43.033929 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-utilities/0.log" Dec 16 15:58:43 crc kubenswrapper[4728]: I1216 15:58:43.312137 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/registry-server/0.log" Dec 16 15:59:01 crc kubenswrapper[4728]: E1216 15:59:01.136992 4728 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.210:49292->38.102.83.210:34353: write tcp 38.102.83.210:49292->38.102.83.210:34353: write: broken pipe Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.177169 4728 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g"] Dec 16 16:00:00 crc kubenswrapper[4728]: E1216 16:00:00.177965 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16804427-3a97-4b04-9ff1-132371e8c396" containerName="container-00" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.177977 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="16804427-3a97-4b04-9ff1-132371e8c396" containerName="container-00" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.178187 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="16804427-3a97-4b04-9ff1-132371e8c396" containerName="container-00" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.178771 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.183717 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.185587 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.207306 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g"] Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.223475 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1892e45a-a7a1-4de3-b7ff-6de653569f48-config-volume\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.223556 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1892e45a-a7a1-4de3-b7ff-6de653569f48-secret-volume\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.223636 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvmx\" (UniqueName: \"kubernetes.io/projected/1892e45a-a7a1-4de3-b7ff-6de653569f48-kube-api-access-tfvmx\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.325043 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1892e45a-a7a1-4de3-b7ff-6de653569f48-secret-volume\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.325180 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvmx\" (UniqueName: \"kubernetes.io/projected/1892e45a-a7a1-4de3-b7ff-6de653569f48-kube-api-access-tfvmx\") pod \"collect-profiles-29431680-ss99g\" (UID: 
\"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.325270 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1892e45a-a7a1-4de3-b7ff-6de653569f48-config-volume\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.326168 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1892e45a-a7a1-4de3-b7ff-6de653569f48-config-volume\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.332474 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1892e45a-a7a1-4de3-b7ff-6de653569f48-secret-volume\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.359017 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvmx\" (UniqueName: \"kubernetes.io/projected/1892e45a-a7a1-4de3-b7ff-6de653569f48-kube-api-access-tfvmx\") pod \"collect-profiles-29431680-ss99g\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:00 crc kubenswrapper[4728]: I1216 16:00:00.496398 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:01 crc kubenswrapper[4728]: I1216 16:00:01.008857 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g"] Dec 16 16:00:01 crc kubenswrapper[4728]: W1216 16:00:01.038187 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1892e45a_a7a1_4de3_b7ff_6de653569f48.slice/crio-ffc8710d92e55cd55c765faaa7989dba5e6723b975a6ebadc2018aab1a5c1282 WatchSource:0}: Error finding container ffc8710d92e55cd55c765faaa7989dba5e6723b975a6ebadc2018aab1a5c1282: Status 404 returned error can't find the container with id ffc8710d92e55cd55c765faaa7989dba5e6723b975a6ebadc2018aab1a5c1282 Dec 16 16:00:01 crc kubenswrapper[4728]: I1216 16:00:01.546941 4728 generic.go:334] "Generic (PLEG): container finished" podID="1892e45a-a7a1-4de3-b7ff-6de653569f48" containerID="0f2d59ded8b5c1f543000c29958f2c2cc3bbb75f589b0ff99df22786b568f225" exitCode=0 Dec 16 16:00:01 crc kubenswrapper[4728]: I1216 16:00:01.547222 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" event={"ID":"1892e45a-a7a1-4de3-b7ff-6de653569f48","Type":"ContainerDied","Data":"0f2d59ded8b5c1f543000c29958f2c2cc3bbb75f589b0ff99df22786b568f225"} Dec 16 16:00:01 crc kubenswrapper[4728]: I1216 16:00:01.547279 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" event={"ID":"1892e45a-a7a1-4de3-b7ff-6de653569f48","Type":"ContainerStarted","Data":"ffc8710d92e55cd55c765faaa7989dba5e6723b975a6ebadc2018aab1a5c1282"} Dec 16 16:00:02 crc kubenswrapper[4728]: I1216 16:00:02.943534 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:02 crc kubenswrapper[4728]: I1216 16:00:02.998395 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1892e45a-a7a1-4de3-b7ff-6de653569f48-config-volume\") pod \"1892e45a-a7a1-4de3-b7ff-6de653569f48\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " Dec 16 16:00:02 crc kubenswrapper[4728]: I1216 16:00:02.998498 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfvmx\" (UniqueName: \"kubernetes.io/projected/1892e45a-a7a1-4de3-b7ff-6de653569f48-kube-api-access-tfvmx\") pod \"1892e45a-a7a1-4de3-b7ff-6de653569f48\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " Dec 16 16:00:02 crc kubenswrapper[4728]: I1216 16:00:02.998554 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1892e45a-a7a1-4de3-b7ff-6de653569f48-secret-volume\") pod \"1892e45a-a7a1-4de3-b7ff-6de653569f48\" (UID: \"1892e45a-a7a1-4de3-b7ff-6de653569f48\") " Dec 16 16:00:02 crc kubenswrapper[4728]: I1216 16:00:02.999943 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1892e45a-a7a1-4de3-b7ff-6de653569f48-config-volume" (OuterVolumeSpecName: "config-volume") pod "1892e45a-a7a1-4de3-b7ff-6de653569f48" (UID: "1892e45a-a7a1-4de3-b7ff-6de653569f48"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.004647 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1892e45a-a7a1-4de3-b7ff-6de653569f48-kube-api-access-tfvmx" (OuterVolumeSpecName: "kube-api-access-tfvmx") pod "1892e45a-a7a1-4de3-b7ff-6de653569f48" (UID: "1892e45a-a7a1-4de3-b7ff-6de653569f48"). InnerVolumeSpecName "kube-api-access-tfvmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.004758 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1892e45a-a7a1-4de3-b7ff-6de653569f48-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1892e45a-a7a1-4de3-b7ff-6de653569f48" (UID: "1892e45a-a7a1-4de3-b7ff-6de653569f48"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.101452 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfvmx\" (UniqueName: \"kubernetes.io/projected/1892e45a-a7a1-4de3-b7ff-6de653569f48-kube-api-access-tfvmx\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.101515 4728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1892e45a-a7a1-4de3-b7ff-6de653569f48-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.101536 4728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1892e45a-a7a1-4de3-b7ff-6de653569f48-config-volume\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.579217 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" event={"ID":"1892e45a-a7a1-4de3-b7ff-6de653569f48","Type":"ContainerDied","Data":"ffc8710d92e55cd55c765faaa7989dba5e6723b975a6ebadc2018aab1a5c1282"} Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.579775 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc8710d92e55cd55c765faaa7989dba5e6723b975a6ebadc2018aab1a5c1282" Dec 16 16:00:03 crc kubenswrapper[4728]: I1216 16:00:03.579357 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29431680-ss99g" Dec 16 16:00:04 crc kubenswrapper[4728]: I1216 16:00:04.041217 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc"] Dec 16 16:00:04 crc kubenswrapper[4728]: I1216 16:00:04.049823 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29431635-9jtzc"] Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.410673 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2f7f"] Dec 16 16:00:05 crc kubenswrapper[4728]: E1216 16:00:05.412388 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1892e45a-a7a1-4de3-b7ff-6de653569f48" containerName="collect-profiles" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.412572 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="1892e45a-a7a1-4de3-b7ff-6de653569f48" containerName="collect-profiles" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.412938 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="1892e45a-a7a1-4de3-b7ff-6de653569f48" containerName="collect-profiles" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.415211 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.422607 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2f7f"] Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.517096 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b159943c-5acb-49ba-951d-fb64f30525d2" path="/var/lib/kubelet/pods/b159943c-5acb-49ba-951d-fb64f30525d2/volumes" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.544546 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-catalog-content\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.544603 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j46n\" (UniqueName: \"kubernetes.io/projected/5750edfb-4af9-40ff-a8dd-9762c15ff51d-kube-api-access-4j46n\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.544882 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-utilities\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.646524 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-utilities\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: 
I1216 16:00:05.646697 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-catalog-content\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.646739 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j46n\" (UniqueName: \"kubernetes.io/projected/5750edfb-4af9-40ff-a8dd-9762c15ff51d-kube-api-access-4j46n\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.647088 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-utilities\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.647129 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-catalog-content\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.672450 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j46n\" (UniqueName: \"kubernetes.io/projected/5750edfb-4af9-40ff-a8dd-9762c15ff51d-kube-api-access-4j46n\") pod \"community-operators-b2f7f\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:05 crc kubenswrapper[4728]: I1216 16:00:05.740130 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:06 crc kubenswrapper[4728]: I1216 16:00:06.351394 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2f7f"] Dec 16 16:00:06 crc kubenswrapper[4728]: I1216 16:00:06.615083 4728 generic.go:334] "Generic (PLEG): container finished" podID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerID="e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861" exitCode=0 Dec 16 16:00:06 crc kubenswrapper[4728]: I1216 16:00:06.615119 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2f7f" event={"ID":"5750edfb-4af9-40ff-a8dd-9762c15ff51d","Type":"ContainerDied","Data":"e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861"} Dec 16 16:00:06 crc kubenswrapper[4728]: I1216 16:00:06.615143 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2f7f" event={"ID":"5750edfb-4af9-40ff-a8dd-9762c15ff51d","Type":"ContainerStarted","Data":"10a8d7ce76cf254ee0beb3332e82886a0ef0be1ffb9ac8b7965b243d90770001"} Dec 16 16:00:08 crc kubenswrapper[4728]: I1216 16:00:08.818468 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:00:08 crc kubenswrapper[4728]: I1216 16:00:08.819696 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:00:10 crc kubenswrapper[4728]: I1216 16:00:10.647929 4728 generic.go:334] "Generic (PLEG): container finished" podID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerID="6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403" exitCode=0 Dec 16 16:00:10 crc kubenswrapper[4728]: I1216 16:00:10.648075 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2f7f" event={"ID":"5750edfb-4af9-40ff-a8dd-9762c15ff51d","Type":"ContainerDied","Data":"6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403"} Dec 16 16:00:12 crc kubenswrapper[4728]: I1216 16:00:12.676768 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2f7f" event={"ID":"5750edfb-4af9-40ff-a8dd-9762c15ff51d","Type":"ContainerStarted","Data":"9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824"} Dec 16 16:00:15 crc kubenswrapper[4728]: I1216 16:00:15.740291 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:15 crc kubenswrapper[4728]: I1216 16:00:15.740671 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:15 crc kubenswrapper[4728]: I1216 16:00:15.812103 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:15 crc kubenswrapper[4728]: I1216 16:00:15.861457 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-b2f7f" podStartSLOduration=6.305115874 podStartE2EDuration="10.861428371s" podCreationTimestamp="2025-12-16 16:00:05 +0000 UTC" firstStartedPulling="2025-12-16 16:00:06.61785064 +0000 UTC m=+3787.458029624" lastFinishedPulling="2025-12-16 16:00:11.174163137 +0000 UTC m=+3792.014342121" observedRunningTime="2025-12-16 16:00:12.699479386 +0000 UTC m=+3793.539658390" watchObservedRunningTime="2025-12-16 16:00:15.861428371 +0000 UTC m=+3796.701607395" Dec 16 16:00:16 crc kubenswrapper[4728]: I1216 16:00:16.799722 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:16 crc kubenswrapper[4728]: I1216 16:00:16.859222 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2f7f"] Dec 16 16:00:18 crc kubenswrapper[4728]: I1216 16:00:18.755251 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b2f7f" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="registry-server" containerID="cri-o://9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824" gracePeriod=2 Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.121495 4728 scope.go:117] "RemoveContainer" containerID="f54aa5d5c22d3a9276bc6e0addcbecdf3287e642a70fb7b88f139e3be8fb7084" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.289924 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.389676 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-utilities\") pod \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.389788 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-catalog-content\") pod \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.389894 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j46n\" (UniqueName: \"kubernetes.io/projected/5750edfb-4af9-40ff-a8dd-9762c15ff51d-kube-api-access-4j46n\") pod \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\" (UID: \"5750edfb-4af9-40ff-a8dd-9762c15ff51d\") " Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.390824 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-utilities" (OuterVolumeSpecName: "utilities") pod "5750edfb-4af9-40ff-a8dd-9762c15ff51d" (UID: "5750edfb-4af9-40ff-a8dd-9762c15ff51d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.399675 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5750edfb-4af9-40ff-a8dd-9762c15ff51d-kube-api-access-4j46n" (OuterVolumeSpecName: "kube-api-access-4j46n") pod "5750edfb-4af9-40ff-a8dd-9762c15ff51d" (UID: "5750edfb-4af9-40ff-a8dd-9762c15ff51d"). InnerVolumeSpecName "kube-api-access-4j46n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.443948 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5750edfb-4af9-40ff-a8dd-9762c15ff51d" (UID: "5750edfb-4af9-40ff-a8dd-9762c15ff51d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.493154 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j46n\" (UniqueName: \"kubernetes.io/projected/5750edfb-4af9-40ff-a8dd-9762c15ff51d-kube-api-access-4j46n\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.493217 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.493232 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5750edfb-4af9-40ff-a8dd-9762c15ff51d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.770034 4728 generic.go:334] "Generic (PLEG): container finished" podID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerID="9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824" exitCode=0 Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.770091 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2f7f" event={"ID":"5750edfb-4af9-40ff-a8dd-9762c15ff51d","Type":"ContainerDied","Data":"9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824"} Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.770115 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2f7f" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.770125 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2f7f" event={"ID":"5750edfb-4af9-40ff-a8dd-9762c15ff51d","Type":"ContainerDied","Data":"10a8d7ce76cf254ee0beb3332e82886a0ef0be1ffb9ac8b7965b243d90770001"} Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.770146 4728 scope.go:117] "RemoveContainer" containerID="9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.798196 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2f7f"] Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.798711 4728 scope.go:117] "RemoveContainer" containerID="6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.806779 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b2f7f"] Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.826069 4728 scope.go:117] "RemoveContainer" containerID="e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.850371 4728 scope.go:117] "RemoveContainer" containerID="9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824" Dec 16 16:00:19 crc kubenswrapper[4728]: E1216 16:00:19.851713 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824\": container with ID starting with 9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824 not found: ID does not exist" containerID="9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.852728 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824"} err="failed to get container status \"9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824\": rpc error: code = NotFound desc = could not find container \"9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824\": container with ID starting with 9ab8c82e2ebbeeb72beeac5755c8f0460336031349c63fe8faa210b991b78824 not found: ID does not exist" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.852771 4728 scope.go:117] "RemoveContainer" containerID="6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403" Dec 16 16:00:19 crc kubenswrapper[4728]: E1216 16:00:19.853148 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403\": container with ID starting with 6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403 not found: ID does not exist" containerID="6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.853205 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403"} err="failed to get container status \"6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403\": rpc error: code = NotFound desc = could not find 
container \"6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403\": container with ID starting with 6110942fc25284af740369b6ac6ffbd21fba0e85427a63ddbaca00394fd50403 not found: ID does not exist" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.853237 4728 scope.go:117] "RemoveContainer" containerID="e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861" Dec 16 16:00:19 crc kubenswrapper[4728]: E1216 16:00:19.853614 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861\": container with ID starting with e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861 not found: ID does not exist" containerID="e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861" Dec 16 16:00:19 crc kubenswrapper[4728]: I1216 16:00:19.853647 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861"} err="failed to get container status \"e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861\": rpc error: code = NotFound desc = could not find container \"e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861\": container with ID starting with e4243c7c2a17f4009ea3233586198693a9039f867818f4906da5d250b0617861 not found: ID does not exist" Dec 16 16:00:21 crc kubenswrapper[4728]: I1216 16:00:21.517176 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" path="/var/lib/kubelet/pods/5750edfb-4af9-40ff-a8dd-9762c15ff51d/volumes" Dec 16 16:00:21 crc kubenswrapper[4728]: I1216 16:00:21.793511 4728 generic.go:334] "Generic (PLEG): container finished" podID="745dd14c-6645-41dd-a220-c45f310237d4" containerID="ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810" exitCode=0 Dec 16 16:00:21 crc kubenswrapper[4728]: I1216 16:00:21.793572 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9rv49/must-gather-g26v5" event={"ID":"745dd14c-6645-41dd-a220-c45f310237d4","Type":"ContainerDied","Data":"ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810"} Dec 16 16:00:21 crc kubenswrapper[4728]: I1216 16:00:21.794638 4728 scope.go:117] "RemoveContainer" containerID="ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810" Dec 16 16:00:22 crc kubenswrapper[4728]: I1216 16:00:22.734876 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9rv49_must-gather-g26v5_745dd14c-6645-41dd-a220-c45f310237d4/gather/0.log" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.238140 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9rv49/must-gather-g26v5"] Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.240272 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9rv49/must-gather-g26v5" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="copy" containerID="cri-o://15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b" gracePeriod=2 Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.257102 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9rv49/must-gather-g26v5"] Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.682554 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-9rv49_must-gather-g26v5_745dd14c-6645-41dd-a220-c45f310237d4/copy/0.log" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.683212 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.808216 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7zl5\" (UniqueName: \"kubernetes.io/projected/745dd14c-6645-41dd-a220-c45f310237d4-kube-api-access-t7zl5\") pod \"745dd14c-6645-41dd-a220-c45f310237d4\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.808347 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/745dd14c-6645-41dd-a220-c45f310237d4-must-gather-output\") pod \"745dd14c-6645-41dd-a220-c45f310237d4\" (UID: \"745dd14c-6645-41dd-a220-c45f310237d4\") " Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.828563 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745dd14c-6645-41dd-a220-c45f310237d4-kube-api-access-t7zl5" (OuterVolumeSpecName: "kube-api-access-t7zl5") pod "745dd14c-6645-41dd-a220-c45f310237d4" (UID: "745dd14c-6645-41dd-a220-c45f310237d4"). InnerVolumeSpecName "kube-api-access-t7zl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.885280 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9rv49_must-gather-g26v5_745dd14c-6645-41dd-a220-c45f310237d4/copy/0.log" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.885945 4728 generic.go:334] "Generic (PLEG): container finished" podID="745dd14c-6645-41dd-a220-c45f310237d4" containerID="15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b" exitCode=143 Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.886031 4728 scope.go:117] "RemoveContainer" containerID="15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.886162 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9rv49/must-gather-g26v5" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.904066 4728 scope.go:117] "RemoveContainer" containerID="ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.911325 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7zl5\" (UniqueName: \"kubernetes.io/projected/745dd14c-6645-41dd-a220-c45f310237d4-kube-api-access-t7zl5\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.979675 4728 scope.go:117] "RemoveContainer" containerID="15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b" Dec 16 16:00:30 crc kubenswrapper[4728]: E1216 16:00:30.980184 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b\": container with ID starting with 15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b not found: ID does not exist" containerID="15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.980227 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b"} err="failed to get container status \"15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b\": rpc error: code = NotFound desc = could not find container \"15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b\": container with ID starting with 15eccede066d5fdf138d5fe4024c598249cf6f385d527bcccce4ca8f9b42392b not found: ID does not exist" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.980256 4728 scope.go:117] "RemoveContainer" containerID="ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810" Dec 16 16:00:30 crc kubenswrapper[4728]: E1216 16:00:30.980507 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810\": container with ID starting with ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810 not found: ID does not exist" containerID="ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.980532 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810"} err="failed to get container status \"ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810\": rpc error: code = NotFound desc = could not find container \"ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810\": container with ID starting with ef0f71f25491aaff0699a2c39bbf25f4c99415f624c59c9eef22a1f96f231810 not found: ID does not exist" Dec 16 16:00:30 crc kubenswrapper[4728]: I1216 16:00:30.987093 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745dd14c-6645-41dd-a220-c45f310237d4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "745dd14c-6645-41dd-a220-c45f310237d4" (UID: "745dd14c-6645-41dd-a220-c45f310237d4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:00:31 crc kubenswrapper[4728]: I1216 16:00:31.012937 4728 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/745dd14c-6645-41dd-a220-c45f310237d4-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 16 16:00:31 crc kubenswrapper[4728]: I1216 16:00:31.521181 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="745dd14c-6645-41dd-a220-c45f310237d4" path="/var/lib/kubelet/pods/745dd14c-6645-41dd-a220-c45f310237d4/volumes" Dec 16 16:00:38 crc kubenswrapper[4728]: I1216 16:00:38.818884 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:00:38 crc kubenswrapper[4728]: I1216 16:00:38.819530 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.451850 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xvhh"] Dec 16 16:00:49 crc kubenswrapper[4728]: E1216 16:00:49.452891 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="copy" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.452911 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="copy" Dec 16 16:00:49 crc kubenswrapper[4728]: E1216 16:00:49.452944 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="registry-server" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.452955 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="registry-server" Dec 16 16:00:49 crc kubenswrapper[4728]: E1216 16:00:49.452975 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="extract-utilities" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.452984 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="extract-utilities" Dec 16 16:00:49 crc kubenswrapper[4728]: E1216 16:00:49.453002 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="gather" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.453009 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="gather" Dec 16 16:00:49 crc kubenswrapper[4728]: E1216 16:00:49.453029 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="extract-content" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.453037 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="extract-content" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.453246 4728 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="gather" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.453267 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="745dd14c-6645-41dd-a220-c45f310237d4" containerName="copy" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.453334 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5750edfb-4af9-40ff-a8dd-9762c15ff51d" containerName="registry-server" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.455055 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.461250 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xvhh"] Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.580531 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-utilities\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.580757 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdqvg\" (UniqueName: \"kubernetes.io/projected/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-kube-api-access-mdqvg\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.581048 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-catalog-content\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.683170 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-catalog-content\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.683283 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-utilities\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.683366 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdqvg\" (UniqueName: \"kubernetes.io/projected/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-kube-api-access-mdqvg\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.683768 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-catalog-content\") pod \"redhat-operators-4xvhh\" (UID: 
\"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.683778 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-utilities\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.714360 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdqvg\" (UniqueName: \"kubernetes.io/projected/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-kube-api-access-mdqvg\") pod \"redhat-operators-4xvhh\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:49 crc kubenswrapper[4728]: I1216 16:00:49.777914 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:50 crc kubenswrapper[4728]: I1216 16:00:50.223900 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xvhh"] Dec 16 16:00:51 crc kubenswrapper[4728]: I1216 16:00:51.105676 4728 generic.go:334] "Generic (PLEG): container finished" podID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerID="6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228" exitCode=0 Dec 16 16:00:51 crc kubenswrapper[4728]: I1216 16:00:51.105716 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerDied","Data":"6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228"} Dec 16 16:00:51 crc kubenswrapper[4728]: I1216 16:00:51.105737 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerStarted","Data":"5373c0fc075f15b36dab8a5eb87f9e8de86b1e3ebdff005c8a4690c224a93b72"} Dec 16 16:00:51 crc kubenswrapper[4728]: I1216 16:00:51.107964 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 16:00:52 crc kubenswrapper[4728]: I1216 16:00:52.119702 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerStarted","Data":"7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08"} Dec 16 16:00:54 crc kubenswrapper[4728]: I1216 16:00:54.142465 4728 generic.go:334] "Generic (PLEG): container finished" podID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerID="7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08" exitCode=0 Dec 16 16:00:54 crc kubenswrapper[4728]: I1216 16:00:54.142538 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerDied","Data":"7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08"} Dec 16 16:00:55 crc kubenswrapper[4728]: I1216 16:00:55.154030 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerStarted","Data":"800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7"} Dec 16 16:00:55 crc kubenswrapper[4728]: I1216 
16:00:55.174051 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xvhh" podStartSLOduration=2.617249998 podStartE2EDuration="6.174029994s" podCreationTimestamp="2025-12-16 16:00:49 +0000 UTC" firstStartedPulling="2025-12-16 16:00:51.107750629 +0000 UTC m=+3831.947929613" lastFinishedPulling="2025-12-16 16:00:54.664530585 +0000 UTC m=+3835.504709609" observedRunningTime="2025-12-16 16:00:55.169064729 +0000 UTC m=+3836.009243713" watchObservedRunningTime="2025-12-16 16:00:55.174029994 +0000 UTC m=+3836.014208978" Dec 16 16:00:59 crc kubenswrapper[4728]: I1216 16:00:59.779040 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:00:59 crc kubenswrapper[4728]: I1216 16:00:59.780896 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.149562 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29431681-dm5sf"] Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.151652 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.162705 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431681-dm5sf"] Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.276911 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h874b\" (UniqueName: \"kubernetes.io/projected/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-kube-api-access-h874b\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.276994 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-combined-ca-bundle\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.277098 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-fernet-keys\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.277238 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-config-data\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.379286 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-fernet-keys\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.379435 4728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-config-data\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.379483 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h874b\" (UniqueName: \"kubernetes.io/projected/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-kube-api-access-h874b\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.379506 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-combined-ca-bundle\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.386981 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-combined-ca-bundle\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.387279 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-fernet-keys\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.389823 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-config-data\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.409242 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h874b\" (UniqueName: \"kubernetes.io/projected/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-kube-api-access-h874b\") pod \"keystone-cron-29431681-dm5sf\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.472499 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.847024 4728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4xvhh" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="registry-server" probeResult="failure" output=< Dec 16 16:01:00 crc kubenswrapper[4728]: timeout: failed to connect service ":50051" within 1s Dec 16 16:01:00 crc kubenswrapper[4728]: > Dec 16 16:01:00 crc kubenswrapper[4728]: I1216 16:01:00.987250 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29431681-dm5sf"] Dec 16 16:01:01 crc kubenswrapper[4728]: I1216 16:01:01.214183 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-dm5sf" event={"ID":"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4","Type":"ContainerStarted","Data":"59b0af0539ea387318295172a5d5ca0a010f661a9bf54a02982d193b57a9ecda"} Dec 16 16:01:01 crc kubenswrapper[4728]: I1216 16:01:01.214573 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-dm5sf" event={"ID":"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4","Type":"ContainerStarted","Data":"e08fa03e36d102547df07e9513ccd333fe03db25c2de8416e551308ec1154568"} Dec 16 16:01:01 crc kubenswrapper[4728]: I1216 16:01:01.242132 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29431681-dm5sf" podStartSLOduration=1.242109554 podStartE2EDuration="1.242109554s" podCreationTimestamp="2025-12-16 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:01:01.234317533 +0000 UTC m=+3842.074496527" watchObservedRunningTime="2025-12-16 16:01:01.242109554 +0000 UTC m=+3842.082288558" Dec 16 16:01:04 crc kubenswrapper[4728]: I1216 16:01:04.250934 4728 generic.go:334] "Generic (PLEG): container finished" podID="5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" containerID="59b0af0539ea387318295172a5d5ca0a010f661a9bf54a02982d193b57a9ecda" exitCode=0 Dec 16 16:01:04 crc kubenswrapper[4728]: I1216 16:01:04.251762 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-dm5sf" event={"ID":"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4","Type":"ContainerDied","Data":"59b0af0539ea387318295172a5d5ca0a010f661a9bf54a02982d193b57a9ecda"} Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.840224 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.912706 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-combined-ca-bundle\") pod \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.913198 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h874b\" (UniqueName: \"kubernetes.io/projected/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-kube-api-access-h874b\") pod \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.913260 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-fernet-keys\") pod \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.913329 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-config-data\") pod \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\" (UID: \"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4\") " Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.919443 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-kube-api-access-h874b" (OuterVolumeSpecName: "kube-api-access-h874b") pod "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" (UID: "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4"). InnerVolumeSpecName "kube-api-access-h874b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.920246 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" (UID: "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.947016 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" (UID: "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:01:05 crc kubenswrapper[4728]: I1216 16:01:05.965928 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-config-data" (OuterVolumeSpecName: "config-data") pod "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" (UID: "5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.016397 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h874b\" (UniqueName: \"kubernetes.io/projected/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-kube-api-access-h874b\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.016447 4728 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.016465 4728 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-config-data\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.016490 4728 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.275716 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29431681-dm5sf" event={"ID":"5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4","Type":"ContainerDied","Data":"e08fa03e36d102547df07e9513ccd333fe03db25c2de8416e551308ec1154568"} Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.275757 4728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e08fa03e36d102547df07e9513ccd333fe03db25c2de8416e551308ec1154568" Dec 16 16:01:06 crc kubenswrapper[4728]: I1216 16:01:06.275813 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29431681-dm5sf" Dec 16 16:01:08 crc kubenswrapper[4728]: I1216 16:01:08.819199 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:01:08 crc kubenswrapper[4728]: I1216 16:01:08.819722 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:01:08 crc kubenswrapper[4728]: I1216 16:01:08.819807 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 16:01:08 crc kubenswrapper[4728]: I1216 16:01:08.820678 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 16:01:08 crc kubenswrapper[4728]: I1216 16:01:08.820754 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" 
containerID="cri-o://dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" gracePeriod=600 Dec 16 16:01:08 crc kubenswrapper[4728]: E1216 16:01:08.957234 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:01:09 crc kubenswrapper[4728]: I1216 16:01:09.306484 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" exitCode=0 Dec 16 16:01:09 crc kubenswrapper[4728]: I1216 16:01:09.306536 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d"} Dec 16 16:01:09 crc kubenswrapper[4728]: I1216 16:01:09.306568 4728 scope.go:117] "RemoveContainer" containerID="6057ec8bf018ddc636f1fdb8a788ff08f7c76a9cc99e3eea7068955f5d92eec4" Dec 16 16:01:09 crc kubenswrapper[4728]: I1216 16:01:09.307393 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:01:09 crc kubenswrapper[4728]: E1216 16:01:09.308731 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:01:09 crc kubenswrapper[4728]: I1216 16:01:09.842905 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:01:09 crc kubenswrapper[4728]: I1216 16:01:09.888200 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:01:10 crc kubenswrapper[4728]: I1216 16:01:10.082181 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xvhh"] Dec 16 16:01:11 crc kubenswrapper[4728]: I1216 16:01:11.328266 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xvhh" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="registry-server" containerID="cri-o://800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7" gracePeriod=2 Dec 16 16:01:11 crc kubenswrapper[4728]: I1216 16:01:11.928108 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.033313 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-utilities\") pod \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.033465 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-catalog-content\") pod \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.033548 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdqvg\" (UniqueName: \"kubernetes.io/projected/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-kube-api-access-mdqvg\") pod \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\" (UID: \"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1\") " Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.035156 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-utilities" (OuterVolumeSpecName: "utilities") pod "c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" (UID: "c3bcc0b1-0712-4cc7-841c-db535fa5c6a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.042779 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-kube-api-access-mdqvg" (OuterVolumeSpecName: "kube-api-access-mdqvg") pod "c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" (UID: "c3bcc0b1-0712-4cc7-841c-db535fa5c6a1"). InnerVolumeSpecName "kube-api-access-mdqvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.135996 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdqvg\" (UniqueName: \"kubernetes.io/projected/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-kube-api-access-mdqvg\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.136307 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.146784 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" (UID: "c3bcc0b1-0712-4cc7-841c-db535fa5c6a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.238605 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.338859 4728 generic.go:334] "Generic (PLEG): container finished" podID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerID="800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7" exitCode=0 Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.338903 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerDied","Data":"800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7"} Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.338934 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xvhh" event={"ID":"c3bcc0b1-0712-4cc7-841c-db535fa5c6a1","Type":"ContainerDied","Data":"5373c0fc075f15b36dab8a5eb87f9e8de86b1e3ebdff005c8a4690c224a93b72"} Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.338949 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xvhh" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.338963 4728 scope.go:117] "RemoveContainer" containerID="800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.362634 4728 scope.go:117] "RemoveContainer" containerID="7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.372890 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xvhh"] Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.385078 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xvhh"] Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.399265 4728 scope.go:117] "RemoveContainer" containerID="6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.429470 4728 scope.go:117] "RemoveContainer" containerID="800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7" Dec 16 16:01:12 crc kubenswrapper[4728]: E1216 16:01:12.430062 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7\": container with ID starting with 800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7 not found: ID does not exist" containerID="800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.430095 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7"} err="failed to get container status \"800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7\": rpc error: code = NotFound desc = could not find container \"800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7\": container with ID starting with 800e85b2cabc41f3f2e8afe79572e4d49135f2501880bdf96d8e1e03c7ab38b7 not found: ID does not exist" Dec 16 16:01:12 crc 
kubenswrapper[4728]: I1216 16:01:12.430117 4728 scope.go:117] "RemoveContainer" containerID="7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08" Dec 16 16:01:12 crc kubenswrapper[4728]: E1216 16:01:12.430548 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08\": container with ID starting with 7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08 not found: ID does not exist" containerID="7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.430576 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08"} err="failed to get container status \"7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08\": rpc error: code = NotFound desc = could not find container \"7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08\": container with ID starting with 7dd23c063a25d218548bc82074448017540bc95bf4db3d30c5a1ead00b2bdf08 not found: ID does not exist" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.430596 4728 scope.go:117] "RemoveContainer" containerID="6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228" Dec 16 16:01:12 crc kubenswrapper[4728]: E1216 16:01:12.430964 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228\": container with ID starting with 6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228 not found: ID does not exist" containerID="6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228" Dec 16 16:01:12 crc kubenswrapper[4728]: I1216 16:01:12.430987 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228"} err="failed to get container status \"6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228\": rpc error: code = NotFound desc = could not find container \"6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228\": container with ID starting with 6637498a277d2d7f4963648b0b997c6a24838cb8b540e813524e000f05e61228 not found: ID does not exist" Dec 16 16:01:13 crc kubenswrapper[4728]: I1216 16:01:13.550210 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" path="/var/lib/kubelet/pods/c3bcc0b1-0712-4cc7-841c-db535fa5c6a1/volumes" Dec 16 16:01:24 crc kubenswrapper[4728]: I1216 16:01:24.507023 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:01:24 crc kubenswrapper[4728]: E1216 16:01:24.507943 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:01:36 crc kubenswrapper[4728]: I1216 16:01:36.506977 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" 
Dec 16 16:01:36 crc kubenswrapper[4728]: E1216 16:01:36.508318 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:01:49 crc kubenswrapper[4728]: I1216 16:01:49.519784 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:01:49 crc kubenswrapper[4728]: E1216 16:01:49.520808 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:02:02 crc kubenswrapper[4728]: I1216 16:02:02.507327 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:02:02 crc kubenswrapper[4728]: E1216 16:02:02.508056 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:02:14 crc kubenswrapper[4728]: I1216 16:02:14.506113 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:02:14 crc kubenswrapper[4728]: E1216 16:02:14.506892 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:02:19 crc kubenswrapper[4728]: I1216 16:02:19.354532 4728 scope.go:117] "RemoveContainer" containerID="856234ebb91f2c39c79290797a29772648b2ec5369579d5ffde06566c367b021" Dec 16 16:02:27 crc kubenswrapper[4728]: I1216 16:02:27.506912 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:02:27 crc kubenswrapper[4728]: E1216 16:02:27.508770 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:02:42 crc kubenswrapper[4728]: I1216 16:02:42.506882 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:02:42 
crc kubenswrapper[4728]: E1216 16:02:42.507794 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:02:54 crc kubenswrapper[4728]: I1216 16:02:54.508030 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:02:54 crc kubenswrapper[4728]: E1216 16:02:54.509195 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:03:09 crc kubenswrapper[4728]: I1216 16:03:09.515484 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:03:09 crc kubenswrapper[4728]: E1216 16:03:09.516483 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.937500 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7mmv/must-gather-vxp82"] Dec 16 16:03:17 crc kubenswrapper[4728]: E1216 16:03:17.938354 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" containerName="keystone-cron" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.938367 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" containerName="keystone-cron" Dec 16 16:03:17 crc kubenswrapper[4728]: E1216 16:03:17.938381 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="registry-server" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.938388 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="registry-server" Dec 16 16:03:17 crc kubenswrapper[4728]: E1216 16:03:17.938414 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="extract-content" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.938422 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="extract-content" Dec 16 16:03:17 crc kubenswrapper[4728]: E1216 16:03:17.938441 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="extract-utilities" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.938447 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" 
containerName="extract-utilities" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.938628 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bcc0b1-0712-4cc7-841c-db535fa5c6a1" containerName="registry-server" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.938647 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4" containerName="keystone-cron" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.939577 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.946950 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g7mmv"/"kube-root-ca.crt" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.947187 4728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-g7mmv"/"default-dockercfg-m4blc" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.947328 4728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-g7mmv"/"openshift-service-ca.crt" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.950466 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11afde6c-0804-4567-bfca-9495c69e47c1-must-gather-output\") pod \"must-gather-vxp82\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.950516 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbvg6\" (UniqueName: \"kubernetes.io/projected/11afde6c-0804-4567-bfca-9495c69e47c1-kube-api-access-fbvg6\") pod \"must-gather-vxp82\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:17 crc kubenswrapper[4728]: I1216 16:03:17.959177 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g7mmv/must-gather-vxp82"] Dec 16 16:03:18 crc kubenswrapper[4728]: I1216 16:03:18.052690 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11afde6c-0804-4567-bfca-9495c69e47c1-must-gather-output\") pod \"must-gather-vxp82\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:18 crc kubenswrapper[4728]: I1216 16:03:18.052764 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbvg6\" (UniqueName: \"kubernetes.io/projected/11afde6c-0804-4567-bfca-9495c69e47c1-kube-api-access-fbvg6\") pod \"must-gather-vxp82\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:18 crc kubenswrapper[4728]: I1216 16:03:18.053208 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11afde6c-0804-4567-bfca-9495c69e47c1-must-gather-output\") pod \"must-gather-vxp82\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:18 crc kubenswrapper[4728]: I1216 16:03:18.079443 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbvg6\" (UniqueName: 
\"kubernetes.io/projected/11afde6c-0804-4567-bfca-9495c69e47c1-kube-api-access-fbvg6\") pod \"must-gather-vxp82\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:18 crc kubenswrapper[4728]: I1216 16:03:18.267846 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:03:18 crc kubenswrapper[4728]: I1216 16:03:18.737646 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g7mmv/must-gather-vxp82"] Dec 16 16:03:18 crc kubenswrapper[4728]: W1216 16:03:18.749822 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11afde6c_0804_4567_bfca_9495c69e47c1.slice/crio-81a25ea5af9980ed779a38bd83f09fb2ac84758d9842fb94f418f305a6dfdf7e WatchSource:0}: Error finding container 81a25ea5af9980ed779a38bd83f09fb2ac84758d9842fb94f418f305a6dfdf7e: Status 404 returned error can't find the container with id 81a25ea5af9980ed779a38bd83f09fb2ac84758d9842fb94f418f305a6dfdf7e Dec 16 16:03:19 crc kubenswrapper[4728]: I1216 16:03:19.436592 4728 scope.go:117] "RemoveContainer" containerID="c0a5d4926faa61d63e8b2631d3ca0e1af22d01ce26fca2c8e11c5fc68d2ea169" Dec 16 16:03:19 crc kubenswrapper[4728]: I1216 16:03:19.587130 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/must-gather-vxp82" event={"ID":"11afde6c-0804-4567-bfca-9495c69e47c1","Type":"ContainerStarted","Data":"edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9"} Dec 16 16:03:19 crc kubenswrapper[4728]: I1216 16:03:19.587735 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/must-gather-vxp82" event={"ID":"11afde6c-0804-4567-bfca-9495c69e47c1","Type":"ContainerStarted","Data":"45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8"} Dec 16 16:03:19 crc kubenswrapper[4728]: I1216 16:03:19.587770 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/must-gather-vxp82" event={"ID":"11afde6c-0804-4567-bfca-9495c69e47c1","Type":"ContainerStarted","Data":"81a25ea5af9980ed779a38bd83f09fb2ac84758d9842fb94f418f305a6dfdf7e"} Dec 16 16:03:19 crc kubenswrapper[4728]: I1216 16:03:19.618107 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g7mmv/must-gather-vxp82" podStartSLOduration=2.6180913 podStartE2EDuration="2.6180913s" podCreationTimestamp="2025-12-16 16:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:03:19.609851828 +0000 UTC m=+3980.450030812" watchObservedRunningTime="2025-12-16 16:03:19.6180913 +0000 UTC m=+3980.458270284" Dec 16 16:03:21 crc kubenswrapper[4728]: E1216 16:03:21.817626 4728 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.210:57300->38.102.83.210:34353: read tcp 38.102.83.210:57300->38.102.83.210:34353: read: connection reset by peer Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.688149 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-jvt5p"] Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.691468 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.841089 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf9zt\" (UniqueName: \"kubernetes.io/projected/66946ce8-3548-442a-971a-fdb024c0afb1-kube-api-access-qf9zt\") pod \"crc-debug-jvt5p\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.841491 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66946ce8-3548-442a-971a-fdb024c0afb1-host\") pod \"crc-debug-jvt5p\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.943479 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66946ce8-3548-442a-971a-fdb024c0afb1-host\") pod \"crc-debug-jvt5p\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.943693 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf9zt\" (UniqueName: \"kubernetes.io/projected/66946ce8-3548-442a-971a-fdb024c0afb1-kube-api-access-qf9zt\") pod \"crc-debug-jvt5p\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.944281 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66946ce8-3548-442a-971a-fdb024c0afb1-host\") pod \"crc-debug-jvt5p\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:22 crc kubenswrapper[4728]: I1216 16:03:22.963329 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf9zt\" (UniqueName: \"kubernetes.io/projected/66946ce8-3548-442a-971a-fdb024c0afb1-kube-api-access-qf9zt\") pod \"crc-debug-jvt5p\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:23 crc kubenswrapper[4728]: I1216 16:03:23.017966 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:23 crc kubenswrapper[4728]: I1216 16:03:23.626475 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" event={"ID":"66946ce8-3548-442a-971a-fdb024c0afb1","Type":"ContainerStarted","Data":"0139c51e78987fbad9ae29b16b84455084521f0c662a176cdcf6976a932d2d80"} Dec 16 16:03:23 crc kubenswrapper[4728]: I1216 16:03:23.627090 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" event={"ID":"66946ce8-3548-442a-971a-fdb024c0afb1","Type":"ContainerStarted","Data":"999a3863a95475bfdfcdd339b57ab2c71c9be89f5c3324d9f73f3cfc37b62512"} Dec 16 16:03:23 crc kubenswrapper[4728]: I1216 16:03:23.645377 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" podStartSLOduration=1.645360518 podStartE2EDuration="1.645360518s" podCreationTimestamp="2025-12-16 16:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:03:23.641195876 +0000 UTC m=+3984.481374860" watchObservedRunningTime="2025-12-16 16:03:23.645360518 +0000 UTC m=+3984.485539502" Dec 16 16:03:24 crc kubenswrapper[4728]: I1216 16:03:24.506824 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:03:24 crc kubenswrapper[4728]: E1216 16:03:24.507427 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:03:36 crc kubenswrapper[4728]: I1216 16:03:36.507066 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:03:36 crc kubenswrapper[4728]: E1216 16:03:36.507985 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:03:50 crc kubenswrapper[4728]: I1216 16:03:50.507068 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:03:50 crc kubenswrapper[4728]: E1216 16:03:50.507753 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:03:57 crc kubenswrapper[4728]: I1216 16:03:57.959101 4728 generic.go:334] "Generic (PLEG): container finished" podID="66946ce8-3548-442a-971a-fdb024c0afb1" containerID="0139c51e78987fbad9ae29b16b84455084521f0c662a176cdcf6976a932d2d80" 
exitCode=0 Dec 16 16:03:57 crc kubenswrapper[4728]: I1216 16:03:57.959188 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" event={"ID":"66946ce8-3548-442a-971a-fdb024c0afb1","Type":"ContainerDied","Data":"0139c51e78987fbad9ae29b16b84455084521f0c662a176cdcf6976a932d2d80"} Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.085706 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.141731 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-jvt5p"] Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.150558 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-jvt5p"] Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.281031 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66946ce8-3548-442a-971a-fdb024c0afb1-host\") pod \"66946ce8-3548-442a-971a-fdb024c0afb1\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.281095 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66946ce8-3548-442a-971a-fdb024c0afb1-host" (OuterVolumeSpecName: "host") pod "66946ce8-3548-442a-971a-fdb024c0afb1" (UID: "66946ce8-3548-442a-971a-fdb024c0afb1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.281118 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf9zt\" (UniqueName: \"kubernetes.io/projected/66946ce8-3548-442a-971a-fdb024c0afb1-kube-api-access-qf9zt\") pod \"66946ce8-3548-442a-971a-fdb024c0afb1\" (UID: \"66946ce8-3548-442a-971a-fdb024c0afb1\") " Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.281971 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66946ce8-3548-442a-971a-fdb024c0afb1-host\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.288373 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66946ce8-3548-442a-971a-fdb024c0afb1-kube-api-access-qf9zt" (OuterVolumeSpecName: "kube-api-access-qf9zt") pod "66946ce8-3548-442a-971a-fdb024c0afb1" (UID: "66946ce8-3548-442a-971a-fdb024c0afb1"). InnerVolumeSpecName "kube-api-access-qf9zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.384220 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf9zt\" (UniqueName: \"kubernetes.io/projected/66946ce8-3548-442a-971a-fdb024c0afb1-kube-api-access-qf9zt\") on node \"crc\" DevicePath \"\"" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.526361 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66946ce8-3548-442a-971a-fdb024c0afb1" path="/var/lib/kubelet/pods/66946ce8-3548-442a-971a-fdb024c0afb1/volumes" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.982117 4728 scope.go:117] "RemoveContainer" containerID="0139c51e78987fbad9ae29b16b84455084521f0c662a176cdcf6976a932d2d80" Dec 16 16:03:59 crc kubenswrapper[4728]: I1216 16:03:59.982259 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-jvt5p" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.313016 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-66c77"] Dec 16 16:04:00 crc kubenswrapper[4728]: E1216 16:04:00.313461 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66946ce8-3548-442a-971a-fdb024c0afb1" containerName="container-00" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.313476 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="66946ce8-3548-442a-971a-fdb024c0afb1" containerName="container-00" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.313732 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="66946ce8-3548-442a-971a-fdb024c0afb1" containerName="container-00" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.316817 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.505074 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c09f865-340f-478c-998b-5e6606d58cba-host\") pod \"crc-debug-66c77\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.505126 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzxpk\" (UniqueName: \"kubernetes.io/projected/9c09f865-340f-478c-998b-5e6606d58cba-kube-api-access-nzxpk\") pod \"crc-debug-66c77\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.607023 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c09f865-340f-478c-998b-5e6606d58cba-host\") pod \"crc-debug-66c77\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.607119 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzxpk\" (UniqueName: \"kubernetes.io/projected/9c09f865-340f-478c-998b-5e6606d58cba-kube-api-access-nzxpk\") pod \"crc-debug-66c77\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.608649 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c09f865-340f-478c-998b-5e6606d58cba-host\") pod \"crc-debug-66c77\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.647499 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzxpk\" (UniqueName: \"kubernetes.io/projected/9c09f865-340f-478c-998b-5e6606d58cba-kube-api-access-nzxpk\") pod \"crc-debug-66c77\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.933494 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:00 crc kubenswrapper[4728]: W1216 16:04:00.972969 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c09f865_340f_478c_998b_5e6606d58cba.slice/crio-f5ed7a9249adcd361e5731ab9dffc45b277a687374fccc4d53d1acd9be5a2c2f WatchSource:0}: Error finding container f5ed7a9249adcd361e5731ab9dffc45b277a687374fccc4d53d1acd9be5a2c2f: Status 404 returned error can't find the container with id f5ed7a9249adcd361e5731ab9dffc45b277a687374fccc4d53d1acd9be5a2c2f Dec 16 16:04:00 crc kubenswrapper[4728]: I1216 16:04:00.998578 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-66c77" event={"ID":"9c09f865-340f-478c-998b-5e6606d58cba","Type":"ContainerStarted","Data":"f5ed7a9249adcd361e5731ab9dffc45b277a687374fccc4d53d1acd9be5a2c2f"} Dec 16 16:04:02 crc kubenswrapper[4728]: I1216 16:04:02.008216 4728 generic.go:334] "Generic (PLEG): container finished" podID="9c09f865-340f-478c-998b-5e6606d58cba" containerID="70e24d4586109db233d552655da2df5e9392938c76d02a5cf02ad84444cb0fe2" exitCode=0 Dec 16 16:04:02 crc kubenswrapper[4728]: I1216 16:04:02.008577 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-66c77" event={"ID":"9c09f865-340f-478c-998b-5e6606d58cba","Type":"ContainerDied","Data":"70e24d4586109db233d552655da2df5e9392938c76d02a5cf02ad84444cb0fe2"} Dec 16 16:04:02 crc kubenswrapper[4728]: I1216 16:04:02.507596 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:04:02 crc kubenswrapper[4728]: E1216 16:04:02.511192 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:04:02 crc kubenswrapper[4728]: I1216 16:04:02.612075 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-66c77"] Dec 16 16:04:02 crc kubenswrapper[4728]: I1216 16:04:02.620294 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-66c77"] Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.104927 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.256470 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzxpk\" (UniqueName: \"kubernetes.io/projected/9c09f865-340f-478c-998b-5e6606d58cba-kube-api-access-nzxpk\") pod \"9c09f865-340f-478c-998b-5e6606d58cba\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.256568 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c09f865-340f-478c-998b-5e6606d58cba-host\") pod \"9c09f865-340f-478c-998b-5e6606d58cba\" (UID: \"9c09f865-340f-478c-998b-5e6606d58cba\") " Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.256796 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c09f865-340f-478c-998b-5e6606d58cba-host" (OuterVolumeSpecName: "host") pod "9c09f865-340f-478c-998b-5e6606d58cba" (UID: "9c09f865-340f-478c-998b-5e6606d58cba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.257058 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c09f865-340f-478c-998b-5e6606d58cba-host\") on node \"crc\" DevicePath \"\"" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.266098 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c09f865-340f-478c-998b-5e6606d58cba-kube-api-access-nzxpk" (OuterVolumeSpecName: "kube-api-access-nzxpk") pod "9c09f865-340f-478c-998b-5e6606d58cba" (UID: "9c09f865-340f-478c-998b-5e6606d58cba"). InnerVolumeSpecName "kube-api-access-nzxpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.358636 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzxpk\" (UniqueName: \"kubernetes.io/projected/9c09f865-340f-478c-998b-5e6606d58cba-kube-api-access-nzxpk\") on node \"crc\" DevicePath \"\"" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.516583 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c09f865-340f-478c-998b-5e6606d58cba" path="/var/lib/kubelet/pods/9c09f865-340f-478c-998b-5e6606d58cba/volumes" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.788910 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-9t56n"] Dec 16 16:04:03 crc kubenswrapper[4728]: E1216 16:04:03.790118 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c09f865-340f-478c-998b-5e6606d58cba" containerName="container-00" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.790145 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c09f865-340f-478c-998b-5e6606d58cba" containerName="container-00" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.790373 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c09f865-340f-478c-998b-5e6606d58cba" containerName="container-00" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.791059 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.968149 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wrx\" (UniqueName: \"kubernetes.io/projected/2d222b9e-2038-4713-bfef-5a70ba3a4d78-kube-api-access-c8wrx\") pod \"crc-debug-9t56n\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:03 crc kubenswrapper[4728]: I1216 16:04:03.968610 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d222b9e-2038-4713-bfef-5a70ba3a4d78-host\") pod \"crc-debug-9t56n\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.024333 4728 scope.go:117] "RemoveContainer" containerID="70e24d4586109db233d552655da2df5e9392938c76d02a5cf02ad84444cb0fe2" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.024364 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-66c77" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.074089 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d222b9e-2038-4713-bfef-5a70ba3a4d78-host\") pod \"crc-debug-9t56n\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.074200 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wrx\" (UniqueName: \"kubernetes.io/projected/2d222b9e-2038-4713-bfef-5a70ba3a4d78-kube-api-access-c8wrx\") pod \"crc-debug-9t56n\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.074239 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d222b9e-2038-4713-bfef-5a70ba3a4d78-host\") pod \"crc-debug-9t56n\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.091561 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wrx\" (UniqueName: \"kubernetes.io/projected/2d222b9e-2038-4713-bfef-5a70ba3a4d78-kube-api-access-c8wrx\") pod \"crc-debug-9t56n\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:04 crc kubenswrapper[4728]: I1216 16:04:04.105208 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:04 crc kubenswrapper[4728]: W1216 16:04:04.132100 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d222b9e_2038_4713_bfef_5a70ba3a4d78.slice/crio-2eb5f5901d254296e7ee499223e75d2e4563b034f94596df9d5da32de8e9c4ac WatchSource:0}: Error finding container 2eb5f5901d254296e7ee499223e75d2e4563b034f94596df9d5da32de8e9c4ac: Status 404 returned error can't find the container with id 2eb5f5901d254296e7ee499223e75d2e4563b034f94596df9d5da32de8e9c4ac Dec 16 16:04:05 crc kubenswrapper[4728]: I1216 16:04:05.035637 4728 generic.go:334] "Generic (PLEG): container finished" podID="2d222b9e-2038-4713-bfef-5a70ba3a4d78" containerID="223ff96cf6c0af006ee9c6357f5fe9bebb1159814df7f2ffe689d72635801929" exitCode=0 Dec 16 16:04:05 crc kubenswrapper[4728]: I1216 16:04:05.035829 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-9t56n" event={"ID":"2d222b9e-2038-4713-bfef-5a70ba3a4d78","Type":"ContainerDied","Data":"223ff96cf6c0af006ee9c6357f5fe9bebb1159814df7f2ffe689d72635801929"} Dec 16 16:04:05 crc kubenswrapper[4728]: I1216 16:04:05.036009 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/crc-debug-9t56n" event={"ID":"2d222b9e-2038-4713-bfef-5a70ba3a4d78","Type":"ContainerStarted","Data":"2eb5f5901d254296e7ee499223e75d2e4563b034f94596df9d5da32de8e9c4ac"} Dec 16 16:04:05 crc kubenswrapper[4728]: I1216 16:04:05.067568 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-9t56n"] Dec 16 16:04:05 crc kubenswrapper[4728]: I1216 16:04:05.081135 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7mmv/crc-debug-9t56n"] Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.143722 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.314930 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8wrx\" (UniqueName: \"kubernetes.io/projected/2d222b9e-2038-4713-bfef-5a70ba3a4d78-kube-api-access-c8wrx\") pod \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.315445 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d222b9e-2038-4713-bfef-5a70ba3a4d78-host\") pod \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\" (UID: \"2d222b9e-2038-4713-bfef-5a70ba3a4d78\") " Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.316158 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d222b9e-2038-4713-bfef-5a70ba3a4d78-host" (OuterVolumeSpecName: "host") pod "2d222b9e-2038-4713-bfef-5a70ba3a4d78" (UID: "2d222b9e-2038-4713-bfef-5a70ba3a4d78"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.320640 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d222b9e-2038-4713-bfef-5a70ba3a4d78-kube-api-access-c8wrx" (OuterVolumeSpecName: "kube-api-access-c8wrx") pod "2d222b9e-2038-4713-bfef-5a70ba3a4d78" (UID: "2d222b9e-2038-4713-bfef-5a70ba3a4d78"). 
InnerVolumeSpecName "kube-api-access-c8wrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.417950 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8wrx\" (UniqueName: \"kubernetes.io/projected/2d222b9e-2038-4713-bfef-5a70ba3a4d78-kube-api-access-c8wrx\") on node \"crc\" DevicePath \"\"" Dec 16 16:04:06 crc kubenswrapper[4728]: I1216 16:04:06.418172 4728 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d222b9e-2038-4713-bfef-5a70ba3a4d78-host\") on node \"crc\" DevicePath \"\"" Dec 16 16:04:07 crc kubenswrapper[4728]: I1216 16:04:07.053869 4728 scope.go:117] "RemoveContainer" containerID="223ff96cf6c0af006ee9c6357f5fe9bebb1159814df7f2ffe689d72635801929" Dec 16 16:04:07 crc kubenswrapper[4728]: I1216 16:04:07.053975 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/crc-debug-9t56n" Dec 16 16:04:07 crc kubenswrapper[4728]: I1216 16:04:07.516071 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d222b9e-2038-4713-bfef-5a70ba3a4d78" path="/var/lib/kubelet/pods/2d222b9e-2038-4713-bfef-5a70ba3a4d78/volumes" Dec 16 16:04:16 crc kubenswrapper[4728]: I1216 16:04:16.506621 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:04:16 crc kubenswrapper[4728]: E1216 16:04:16.507357 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.102529 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76fcd78578-bhff6_4589b3db-cca9-45d9-a576-71188fd26cd1/barbican-api-log/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.124353 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76fcd78578-bhff6_4589b3db-cca9-45d9-a576-71188fd26cd1/barbican-api/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.311294 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8957f9486-cds65_f3dd302c-4cb1-487b-9995-a99059ee9ac6/barbican-keystone-listener/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.367140 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8957f9486-cds65_f3dd302c-4cb1-487b-9995-a99059ee9ac6/barbican-keystone-listener-log/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.427139 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86cff44659-k2jp2_e3e0ec72-0e84-444e-a66f-50b4fe91adb5/barbican-worker/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.514058 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-86cff44659-k2jp2_e3e0ec72-0e84-444e-a66f-50b4fe91adb5/barbican-worker-log/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.660839 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8jbxr_801eb0fd-312d-4913-8608-52baf1c65fea/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.801962 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/ceilometer-central-agent/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.840520 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/ceilometer-notification-agent/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.870468 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/proxy-httpd/0.log" Dec 16 16:04:27 crc kubenswrapper[4728]: I1216 16:04:27.914179 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0da26377-a29a-4acb-8ef8-4d17c1431d0b/sg-core/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.035241 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c3da261d-5106-45a2-a6c7-d5314450c0af/cinder-api-log/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.112668 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c3da261d-5106-45a2-a6c7-d5314450c0af/cinder-api/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.288890 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a95d0c5b-fcce-46ba-bfae-1b25bf1d10af/cinder-scheduler/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.308788 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nbgk5_81acd27c-46ac-4132-9e15-6858289dbb7b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.312618 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a95d0c5b-fcce-46ba-bfae-1b25bf1d10af/probe/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.492888 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sf5j2_342edac6-5fe9-45d6-9d37-2bc1ed959d23/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.574731 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-xsxxz_e4e2028c-f46c-4fd1-8dee-4fb4860de081/init/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.743250 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-xsxxz_e4e2028c-f46c-4fd1-8dee-4fb4860de081/init/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.795433 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jwww2_26b6262a-41a3-48c4-aba9-a54801be0a7c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:28 crc kubenswrapper[4728]: I1216 16:04:28.828777 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-xsxxz_e4e2028c-f46c-4fd1-8dee-4fb4860de081/dnsmasq-dns/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.000737 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_453173c9-63a1-457e-bf01-dd45f194a238/glance-log/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.014597 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_453173c9-63a1-457e-bf01-dd45f194a238/glance-httpd/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.189508 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_db21c1bc-6a08-4948-8cea-5d5ee3ecd223/glance-httpd/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.218179 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_db21c1bc-6a08-4948-8cea-5d5ee3ecd223/glance-log/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.511039 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:04:29 crc kubenswrapper[4728]: E1216 16:04:29.511342 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.566955 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7585b44dcb-46w99_ac195fba-37cf-48a1-aa91-c9df824ddfe4/horizon/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.629136 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-r8jj5_7238debe-2d46-40ca-b598-2011d69c375c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.869115 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gsn6x_6f876743-6860-4e07-b8ed-d1cfcd92f2a7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:29 crc kubenswrapper[4728]: I1216 16:04:29.889738 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7585b44dcb-46w99_ac195fba-37cf-48a1-aa91-c9df824ddfe4/horizon-log/0.log" Dec 16 16:04:30 crc kubenswrapper[4728]: I1216 16:04:30.100228 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29431681-dm5sf_5d2c351c-1bbc-42b6-ab93-ee6d0e863ed4/keystone-cron/0.log" Dec 16 16:04:30 crc kubenswrapper[4728]: I1216 16:04:30.199549 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5874cbd465-jjmn6_18996006-74fc-4090-941f-783741605f54/keystone-api/0.log" Dec 16 16:04:30 crc kubenswrapper[4728]: I1216 16:04:30.300770 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_dca6bff3-10b0-4969-b7ef-f31cee80091d/kube-state-metrics/0.log" Dec 16 16:04:30 crc kubenswrapper[4728]: I1216 16:04:30.475041 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-fg5md_355982cb-601d-4505-926c-8fa80bd4f3b6/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:30 crc kubenswrapper[4728]: I1216 16:04:30.814281 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-79c9d99cd5-967vg_fcc16f45-2441-47bf-a452-25f78e044a7e/neutron-httpd/0.log" Dec 16 16:04:30 crc kubenswrapper[4728]: I1216 16:04:30.814606 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79c9d99cd5-967vg_fcc16f45-2441-47bf-a452-25f78e044a7e/neutron-api/0.log" Dec 16 16:04:31 crc kubenswrapper[4728]: I1216 16:04:31.042471 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rfklt_545f4f42-f672-4cd9-8050-296aa0dd57b8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:31 crc kubenswrapper[4728]: I1216 16:04:31.533103 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c311506-90af-4f99-867d-aa1f1b5d2d74/nova-api-log/0.log" Dec 16 16:04:31 crc kubenswrapper[4728]: I1216 16:04:31.597632 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_06dceccb-f462-4eec-b6eb-e7b626c54b66/nova-cell0-conductor-conductor/0.log" Dec 16 16:04:31 crc kubenswrapper[4728]: I1216 16:04:31.894460 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8c311506-90af-4f99-867d-aa1f1b5d2d74/nova-api-api/0.log" Dec 16 16:04:31 crc kubenswrapper[4728]: I1216 16:04:31.906445 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_170e3d88-1e9a-4e6b-aead-ced16b98610e/nova-cell1-conductor-conductor/0.log" Dec 16 16:04:31 crc kubenswrapper[4728]: I1216 16:04:31.978426 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_25eae3b4-b392-4ce4-b16c-5aa9d2cb78fc/nova-cell1-novncproxy-novncproxy/0.log" Dec 16 16:04:32 crc kubenswrapper[4728]: I1216 16:04:32.187170 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-77qmn_655f0b26-df18-45a2-a9f9-24df853d48ed/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:32 crc kubenswrapper[4728]: I1216 16:04:32.258600 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f1f3cd14-f2b0-4fde-a31e-e686b154eb77/nova-metadata-log/0.log" Dec 16 16:04:32 crc kubenswrapper[4728]: I1216 16:04:32.575250 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76f2644a-8bb9-4719-83dd-429202a52446/mysql-bootstrap/0.log" Dec 16 16:04:32 crc kubenswrapper[4728]: I1216 16:04:32.620997 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ba3dbe8b-ebcb-47f8-8f81-924aec84c326/nova-scheduler-scheduler/0.log" Dec 16 16:04:32 crc kubenswrapper[4728]: I1216 16:04:32.785189 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76f2644a-8bb9-4719-83dd-429202a52446/mysql-bootstrap/0.log" Dec 16 16:04:32 crc kubenswrapper[4728]: I1216 16:04:32.791737 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_76f2644a-8bb9-4719-83dd-429202a52446/galera/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.018105 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eb629e93-c552-47c3-8c89-11254ffa834f/mysql-bootstrap/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.191802 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eb629e93-c552-47c3-8c89-11254ffa834f/galera/0.log" Dec 16 16:04:33 crc 
kubenswrapper[4728]: I1216 16:04:33.245669 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eb629e93-c552-47c3-8c89-11254ffa834f/mysql-bootstrap/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.372080 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_09f99482-afc8-48dd-95a3-ada07d611db1/openstackclient/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.522355 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hlkkv_37c82b8b-fe2d-4265-80b1-7cdfa00e2be7/ovn-controller/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.626104 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f1f3cd14-f2b0-4fde-a31e-e686b154eb77/nova-metadata-metadata/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.728621 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ccc6t_effa7d99-cccc-431b-91b6-d4302f7dce22/openstack-network-exporter/0.log" Dec 16 16:04:33 crc kubenswrapper[4728]: I1216 16:04:33.853004 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovsdb-server-init/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.090740 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovsdb-server/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.121265 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovs-vswitchd/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.139448 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4m68_865baf70-58f9-4eee-8cf4-d5e96e6d011e/ovsdb-server-init/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.371990 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d390ca4f-5aa2-45e8-a08e-b2e86218e36f/openstack-network-exporter/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.401937 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xxw58_2767eeb4-bf6d-4381-8277-c6d99cad99a5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.430910 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d390ca4f-5aa2-45e8-a08e-b2e86218e36f/ovn-northd/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.579634 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b2809df7-1873-474c-ab44-14b82f630cb0/openstack-network-exporter/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.683116 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b2809df7-1873-474c-ab44-14b82f630cb0/ovsdbserver-nb/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.768519 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d587bd5e-c0c9-48f1-a2b6-616e904ceed3/openstack-network-exporter/0.log" Dec 16 16:04:34 crc kubenswrapper[4728]: I1216 16:04:34.855646 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d587bd5e-c0c9-48f1-a2b6-616e904ceed3/ovsdbserver-sb/0.log" Dec 16 16:04:35 
crc kubenswrapper[4728]: I1216 16:04:35.410145 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7dcd7544cd-gnxgg_7ac43e45-8d37-4ab4-9ebe-441421fe9044/placement-api/0.log" Dec 16 16:04:35 crc kubenswrapper[4728]: I1216 16:04:35.530294 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7dcd7544cd-gnxgg_7ac43e45-8d37-4ab4-9ebe-441421fe9044/placement-log/0.log" Dec 16 16:04:35 crc kubenswrapper[4728]: I1216 16:04:35.609812 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e64ff4ca-1141-477e-8db1-b2068e3b6d9a/setup-container/0.log" Dec 16 16:04:35 crc kubenswrapper[4728]: I1216 16:04:35.787550 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e64ff4ca-1141-477e-8db1-b2068e3b6d9a/setup-container/0.log" Dec 16 16:04:35 crc kubenswrapper[4728]: I1216 16:04:35.879078 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e19aee19-231d-4847-9e7e-78b8745576ae/setup-container/0.log" Dec 16 16:04:35 crc kubenswrapper[4728]: I1216 16:04:35.894551 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e64ff4ca-1141-477e-8db1-b2068e3b6d9a/rabbitmq/0.log" Dec 16 16:04:36 crc kubenswrapper[4728]: I1216 16:04:36.127307 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e19aee19-231d-4847-9e7e-78b8745576ae/setup-container/0.log" Dec 16 16:04:36 crc kubenswrapper[4728]: I1216 16:04:36.179597 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e19aee19-231d-4847-9e7e-78b8745576ae/rabbitmq/0.log" Dec 16 16:04:36 crc kubenswrapper[4728]: I1216 16:04:36.189608 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xlk79_8154d34c-28e4-4d89-a271-f1b2fb4daa29/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:36 crc kubenswrapper[4728]: I1216 16:04:36.348264 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fbn4q_447d1f35-7fe1-4655-8893-3ca4afed13d6/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:36 crc kubenswrapper[4728]: I1216 16:04:36.411277 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-w9xrx_37e0ae2a-b0ba-45a7-9395-1af1365adf86/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.079934 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qpfkk_6270b5fc-f711-41e7-b66c-1cac1f2f3b43/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.162357 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9gcfz_a9b27bc6-f730-4cc8-a626-de82d2c022b8/ssh-known-hosts-edpm-deployment/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.402069 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54bb7475-hxsvl_b5b59721-592a-4649-8246-0487a18177b9/proxy-server/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.425791 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-54bb7475-hxsvl_b5b59721-592a-4649-8246-0487a18177b9/proxy-httpd/0.log" Dec 16 16:04:37 crc 
kubenswrapper[4728]: I1216 16:04:37.480098 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vvql8_55ebd6bb-cac2-4b8f-932d-46662c011b18/swift-ring-rebalance/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.622647 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-reaper/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.643698 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-auditor/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.721711 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-replicator/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.810900 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/account-server/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.837218 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-auditor/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.909014 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-replicator/0.log" Dec 16 16:04:37 crc kubenswrapper[4728]: I1216 16:04:37.954209 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-server/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.005972 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/container-updater/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.051148 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-auditor/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.162899 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-replicator/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.200739 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-expirer/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.207919 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-server/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.264206 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/object-updater/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.400329 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/rsync/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.400473 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fc3761f8-7e22-45e1-8119-a40338b80f1d/swift-recon-cron/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.618256 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-sbhzt_e3adc58c-a09d-4e32-bd59-10d32f1866ca/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.628625 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_78bce531-8ad9-43f3-9d5a-2edaf2df712f/tempest-tests-tempest-tests-runner/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.836025 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_da534e2d-cb12-451d-b5bf-16b7943c82bb/test-operator-logs-container/0.log" Dec 16 16:04:38 crc kubenswrapper[4728]: I1216 16:04:38.891676 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-8n5ns_b642fac2-b01e-4ec8-80dc-3193414e335c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 16 16:04:43 crc kubenswrapper[4728]: I1216 16:04:43.507383 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:04:43 crc kubenswrapper[4728]: E1216 16:04:43.508242 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:04:48 crc kubenswrapper[4728]: I1216 16:04:48.895218 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8cf2b12c-4959-429e-b9db-173f5ddfab90/memcached/0.log" Dec 16 16:04:56 crc kubenswrapper[4728]: I1216 16:04:56.506012 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:04:56 crc kubenswrapper[4728]: E1216 16:04:56.506703 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.156845 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-l6vt8_12db8b96-5f9f-4d46-9dbe-71b1e1d5c82c/manager/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.342038 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-6sdq7_dbf95255-3fe3-4421-be60-212514fef21c/manager/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.372025 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-qtz4v_84531a1b-f019-449d-8779-05b03bde07cb/manager/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.516838 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/util/0.log" Dec 16 16:05:06 crc 
kubenswrapper[4728]: I1216 16:05:06.759717 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/pull/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.787348 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/util/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.793193 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/pull/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.949976 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/util/0.log" Dec 16 16:05:06 crc kubenswrapper[4728]: I1216 16:05:06.972401 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/pull/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.040970 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9c8f91844781ae1ae76808c56ff8ffabf80fa647f88f276a73084ec1e8vqsp_6e466ee1-8f55-4662-932a-fb92d7d03f5e/extract/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.208095 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-wn6qf_f5364dc6-650d-427d-aab6-c50ba3d69b75/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.209853 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-qpsk9_90e228b6-e35d-4ee2-992c-364b4abd8436/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.392976 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-hcdxf_0252b186-dc46-4cca-ba92-9855cb2aa4ec/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.507269 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:05:07 crc kubenswrapper[4728]: E1216 16:05:07.507551 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.650536 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-ttkv5_660d7a4f-e56a-42c8-8db6-d1f7285d7d04/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.679489 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-ljkxp_a8ceccb7-c74c-42c4-a763-d947892f942d/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.748283 4728 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-mns5x_0cc3d254-9633-4e63-91a8-719af70696f6/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.877074 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-mfc2h_59a84980-fdf4-4ff3-b8c7-464e1423bad3/manager/0.log" Dec 16 16:05:07 crc kubenswrapper[4728]: I1216 16:05:07.967197 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-t6vdg_89d4ec07-baef-4061-b6d8-e50f3ab47bb1/manager/0.log" Dec 16 16:05:08 crc kubenswrapper[4728]: I1216 16:05:08.094415 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-xmv9j_fe17017f-5157-4d72-bb40-58a456517c3e/manager/0.log" Dec 16 16:05:08 crc kubenswrapper[4728]: I1216 16:05:08.246655 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-4jbw4_6b5beb20-1139-4774-8ea6-b5c951a6cbba/manager/0.log" Dec 16 16:05:08 crc kubenswrapper[4728]: I1216 16:05:08.301859 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-j7jxc_e501f8ed-3791-4661-8c3e-bfb4eaeeb64d/manager/0.log" Dec 16 16:05:08 crc kubenswrapper[4728]: I1216 16:05:08.400224 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7b67c7f6c5qxdcl_a4b04d21-7de1-4565-99e6-fbeb59a0fde6/manager/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.012002 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bdf96f7b8-fqbkd_d6a45f52-4776-491e-a850-afe8d2efa914/operator/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.104937 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9hjx2_3818a60e-feb9-4ae0-a15a-48c59870b921/registry-server/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.330693 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-68vvq_7a8c4b97-2de8-4235-aa76-c8382c5c5cb1/manager/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.579586 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-f9pgp_160c8222-a7a2-4f58-bbe1-c6a5d4b6b38e/operator/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.596140 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-zcz8p_75c9a0f4-94bc-4bf5-b164-149256d1a214/manager/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.649902 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-757cf4457b-v8kt9_0def48bf-646d-4641-93b5-a9e4e058cc67/manager/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.741651 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-xvvjw_9ca3d3f7-7e18-4c96-9071-1cd82d2b2ee4/manager/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.903101 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-n6x46_f155db6c-255a-4401-884a-b48825bb93c7/manager/0.log" Dec 16 16:05:09 crc kubenswrapper[4728]: I1216 16:05:09.942610 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-9p2mz_8324ae5e-23f8-4267-9822-a4ae37c7cd5a/manager/0.log" Dec 16 16:05:10 crc kubenswrapper[4728]: I1216 16:05:10.093784 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-dz6x8_66c25f6d-85c4-4e3e-bf44-93499cc2321c/manager/0.log" Dec 16 16:05:19 crc kubenswrapper[4728]: I1216 16:05:19.512183 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:05:19 crc kubenswrapper[4728]: E1216 16:05:19.513041 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:05:30 crc kubenswrapper[4728]: I1216 16:05:30.506914 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:05:30 crc kubenswrapper[4728]: E1216 16:05:30.508070 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:05:32 crc kubenswrapper[4728]: I1216 16:05:32.254660 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zf5xv_14a40ff4-9558-428f-a784-c18c5d62d60a/control-plane-machine-set-operator/0.log" Dec 16 16:05:32 crc kubenswrapper[4728]: I1216 16:05:32.419120 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pfz7w_6ef09dcb-9a41-4fb0-8492-cdd81b0222fe/machine-api-operator/0.log" Dec 16 16:05:32 crc kubenswrapper[4728]: I1216 16:05:32.444263 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pfz7w_6ef09dcb-9a41-4fb0-8492-cdd81b0222fe/kube-rbac-proxy/0.log" Dec 16 16:05:42 crc kubenswrapper[4728]: I1216 16:05:42.506645 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:05:42 crc kubenswrapper[4728]: E1216 16:05:42.507481 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:05:46 crc kubenswrapper[4728]: I1216 16:05:46.296920 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xlhf9_98db182b-146e-48eb-918d-ff62909f62de/cert-manager-controller/0.log" Dec 16 16:05:46 crc kubenswrapper[4728]: I1216 16:05:46.481519 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qtpbj_86fdf4d9-bff1-40f5-b1f7-7d74536c7f39/cert-manager-cainjector/0.log" Dec 16 16:05:46 crc kubenswrapper[4728]: I1216 16:05:46.548693 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9x457_14b59c49-2ca7-4fd1-96a7-926474663fc8/cert-manager-webhook/0.log" Dec 16 16:05:57 crc kubenswrapper[4728]: I1216 16:05:57.506808 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:05:57 crc kubenswrapper[4728]: E1216 16:05:57.507516 4728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-njzmx_openshift-machine-config-operator(d5cdc17e-067e-4d74-b768-02966221d3ae)\"" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" Dec 16 16:05:59 crc kubenswrapper[4728]: I1216 16:05:59.711708 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-zzbs8_84209333-74b0-4804-ac6e-e829f0ec1bc7/nmstate-console-plugin/0.log" Dec 16 16:05:59 crc kubenswrapper[4728]: I1216 16:05:59.910287 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hd8rz_d61cf9e1-67c0-4258-af87-e4244df3c68e/nmstate-handler/0.log" Dec 16 16:05:59 crc kubenswrapper[4728]: I1216 16:05:59.933875 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-fkppc_22a101d0-c77f-42c4-88e7-ff7bfb0c204d/kube-rbac-proxy/0.log" Dec 16 16:05:59 crc kubenswrapper[4728]: I1216 16:05:59.985214 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-fkppc_22a101d0-c77f-42c4-88e7-ff7bfb0c204d/nmstate-metrics/0.log" Dec 16 16:06:00 crc kubenswrapper[4728]: I1216 16:06:00.088774 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-9mppg_fdf13fea-12cb-4713-bae2-3cabd3aae756/nmstate-operator/0.log" Dec 16 16:06:00 crc kubenswrapper[4728]: I1216 16:06:00.175049 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-r6qf9_f35491e6-33aa-4c1d-a9c0-1b95f43ad54f/nmstate-webhook/0.log" Dec 16 16:06:10 crc kubenswrapper[4728]: I1216 16:06:10.506817 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:06:11 crc kubenswrapper[4728]: I1216 16:06:11.173690 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"d42f638dc3af8bacb277ff5376bc18296ad870667edf5b251b48b95a53881011"} Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.166496 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z4f7p"] Dec 16 16:06:14 crc kubenswrapper[4728]: E1216 16:06:14.168971 4728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d222b9e-2038-4713-bfef-5a70ba3a4d78" containerName="container-00" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.169002 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d222b9e-2038-4713-bfef-5a70ba3a4d78" containerName="container-00" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.169240 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d222b9e-2038-4713-bfef-5a70ba3a4d78" containerName="container-00" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.171597 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.188481 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4f7p"] Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.193012 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-catalog-content\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.193059 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-utilities\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.193126 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzdf7\" (UniqueName: \"kubernetes.io/projected/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-kube-api-access-pzdf7\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.295093 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzdf7\" (UniqueName: \"kubernetes.io/projected/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-kube-api-access-pzdf7\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.295263 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-catalog-content\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.295363 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-utilities\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.295898 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-catalog-content\") pod \"certified-operators-z4f7p\" 
(UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.295928 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-utilities\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.315796 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzdf7\" (UniqueName: \"kubernetes.io/projected/cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9-kube-api-access-pzdf7\") pod \"certified-operators-z4f7p\" (UID: \"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9\") " pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:14 crc kubenswrapper[4728]: I1216 16:06:14.500964 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:15 crc kubenswrapper[4728]: I1216 16:06:15.062090 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4f7p"] Dec 16 16:06:15 crc kubenswrapper[4728]: W1216 16:06:15.067543 4728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8e0dfc_68ba_4f9c_9c98_8078e3cc9ec9.slice/crio-f809d6e48aa069a437413f3e7ffcceba0725ceae2530de98f604d989c9435bfa WatchSource:0}: Error finding container f809d6e48aa069a437413f3e7ffcceba0725ceae2530de98f604d989c9435bfa: Status 404 returned error can't find the container with id f809d6e48aa069a437413f3e7ffcceba0725ceae2530de98f604d989c9435bfa Dec 16 16:06:15 crc kubenswrapper[4728]: I1216 16:06:15.204783 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4f7p" event={"ID":"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9","Type":"ContainerStarted","Data":"f809d6e48aa069a437413f3e7ffcceba0725ceae2530de98f604d989c9435bfa"} Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.184846 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-x5g2t_bd55b5d2-c827-4b76-bd1e-e1c033737650/kube-rbac-proxy/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.232958 4728 generic.go:334] "Generic (PLEG): container finished" podID="cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9" containerID="ae76756fccc64fb5ee064d9be182001be005be0bbce49a4d791c45bf7a0903c3" exitCode=0 Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.232999 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4f7p" event={"ID":"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9","Type":"ContainerDied","Data":"ae76756fccc64fb5ee064d9be182001be005be0bbce49a4d791c45bf7a0903c3"} Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.236329 4728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.287265 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-x5g2t_bd55b5d2-c827-4b76-bd1e-e1c033737650/controller/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.449490 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 16:06:16 crc 
kubenswrapper[4728]: I1216 16:06:16.595651 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.598837 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.615919 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.652361 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.881915 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.925836 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.948248 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 16:06:16 crc kubenswrapper[4728]: I1216 16:06:16.969274 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 16:06:17 crc kubenswrapper[4728]: I1216 16:06:17.640140 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-frr-files/0.log" Dec 16 16:06:17 crc kubenswrapper[4728]: I1216 16:06:17.647574 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-reloader/0.log" Dec 16 16:06:17 crc kubenswrapper[4728]: I1216 16:06:17.777841 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/cp-metrics/0.log" Dec 16 16:06:17 crc kubenswrapper[4728]: I1216 16:06:17.814442 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/controller/0.log" Dec 16 16:06:17 crc kubenswrapper[4728]: I1216 16:06:17.846814 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/frr-metrics/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.044388 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/kube-rbac-proxy/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.083249 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/kube-rbac-proxy-frr/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.102217 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/reloader/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.266949 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-5w4gq_afa56798-790e-42c2-98af-9e0f7313603c/frr-k8s-webhook-server/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.500730 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fd5945654-clj75_d1b1e578-b0a6-446b-90d1-7df5d4d4a43a/manager/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.551169 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6dfbdf4c69-n5ksx_55c8e87a-d9fe-4f1c-af42-0dee2e0f3fd9/webhook-server/0.log" Dec 16 16:06:18 crc kubenswrapper[4728]: I1216 16:06:18.757394 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-872z5_0c9a8885-9664-4048-bce4-8fc1cab033d8/kube-rbac-proxy/0.log" Dec 16 16:06:19 crc kubenswrapper[4728]: I1216 16:06:19.290657 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-872z5_0c9a8885-9664-4048-bce4-8fc1cab033d8/speaker/0.log" Dec 16 16:06:19 crc kubenswrapper[4728]: I1216 16:06:19.448093 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vwbfc_129197cb-b920-4ccb-870a-b3b7aabc5928/frr/0.log" Dec 16 16:06:24 crc kubenswrapper[4728]: I1216 16:06:24.321914 4728 generic.go:334] "Generic (PLEG): container finished" podID="cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9" containerID="c0b8ad1b91fa33ed7f1026384643e9b6e3152f983505ecc32f31e7511938a508" exitCode=0 Dec 16 16:06:24 crc kubenswrapper[4728]: I1216 16:06:24.322044 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4f7p" event={"ID":"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9","Type":"ContainerDied","Data":"c0b8ad1b91fa33ed7f1026384643e9b6e3152f983505ecc32f31e7511938a508"} Dec 16 16:06:25 crc kubenswrapper[4728]: I1216 16:06:25.336529 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4f7p" event={"ID":"cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9","Type":"ContainerStarted","Data":"00c0566d147dcbdb988b4c14d5e58fd116dedf8c56e6950e7f6225b7c718589c"} Dec 16 16:06:25 crc kubenswrapper[4728]: I1216 16:06:25.361784 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z4f7p" podStartSLOduration=2.529721191 podStartE2EDuration="11.36176232s" podCreationTimestamp="2025-12-16 16:06:14 +0000 UTC" firstStartedPulling="2025-12-16 16:06:16.236076039 +0000 UTC m=+4157.076255023" lastFinishedPulling="2025-12-16 16:06:25.068117178 +0000 UTC m=+4165.908296152" observedRunningTime="2025-12-16 16:06:25.354143194 +0000 UTC m=+4166.194322208" watchObservedRunningTime="2025-12-16 16:06:25.36176232 +0000 UTC m=+4166.201941314" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.787815 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jlhdb"] Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.790739 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.814775 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlhdb"] Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.885254 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-catalog-content\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.885323 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xgx\" (UniqueName: \"kubernetes.io/projected/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-kube-api-access-n7xgx\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.885357 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-utilities\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.986556 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-catalog-content\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.987212 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7xgx\" (UniqueName: \"kubernetes.io/projected/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-kube-api-access-n7xgx\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.987613 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-utilities\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.988094 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-utilities\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:28 crc kubenswrapper[4728]: I1216 16:06:28.987074 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-catalog-content\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:29 crc kubenswrapper[4728]: I1216 16:06:29.006896 4728 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n7xgx\" (UniqueName: \"kubernetes.io/projected/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-kube-api-access-n7xgx\") pod \"redhat-marketplace-jlhdb\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:29 crc kubenswrapper[4728]: I1216 16:06:29.116486 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:29 crc kubenswrapper[4728]: I1216 16:06:29.891083 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlhdb"] Dec 16 16:06:30 crc kubenswrapper[4728]: I1216 16:06:30.380722 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlhdb" event={"ID":"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd","Type":"ContainerStarted","Data":"ab92706403d5af422496f8cf2cd18e557402ee253820ab733987f6f6f295b6f0"} Dec 16 16:06:31 crc kubenswrapper[4728]: I1216 16:06:31.390738 4728 generic.go:334] "Generic (PLEG): container finished" podID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerID="4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e" exitCode=0 Dec 16 16:06:31 crc kubenswrapper[4728]: I1216 16:06:31.390828 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlhdb" event={"ID":"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd","Type":"ContainerDied","Data":"4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e"} Dec 16 16:06:32 crc kubenswrapper[4728]: I1216 16:06:32.400960 4728 generic.go:334] "Generic (PLEG): container finished" podID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerID="b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412" exitCode=0 Dec 16 16:06:32 crc kubenswrapper[4728]: I1216 16:06:32.401060 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlhdb" event={"ID":"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd","Type":"ContainerDied","Data":"b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412"} Dec 16 16:06:33 crc kubenswrapper[4728]: I1216 16:06:33.414330 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlhdb" event={"ID":"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd","Type":"ContainerStarted","Data":"286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3"} Dec 16 16:06:33 crc kubenswrapper[4728]: I1216 16:06:33.439051 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jlhdb" podStartSLOduration=3.999774283 podStartE2EDuration="5.439031511s" podCreationTimestamp="2025-12-16 16:06:28 +0000 UTC" firstStartedPulling="2025-12-16 16:06:31.392839528 +0000 UTC m=+4172.233018512" lastFinishedPulling="2025-12-16 16:06:32.832096766 +0000 UTC m=+4173.672275740" observedRunningTime="2025-12-16 16:06:33.43050655 +0000 UTC m=+4174.270685554" watchObservedRunningTime="2025-12-16 16:06:33.439031511 +0000 UTC m=+4174.279210495" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.010667 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/util/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.264278 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/pull/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.267448 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/util/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.277122 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/pull/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.480927 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/extract/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.501757 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.502008 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.555910 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.643307 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/pull/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.668532 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4fwzvk_7fda98d5-5127-49d8-a054-ece045552e27/util/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.714695 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/util/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.874765 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/pull/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.955565 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/pull/0.log" Dec 16 16:06:34 crc kubenswrapper[4728]: I1216 16:06:34.958600 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/util/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.180175 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/pull/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.183477 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/util/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.264724 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8dwd8m_080ce000-f07f-47b8-ad87-0dd66e7a6fba/extract/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.367782 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-utilities/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.493128 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z4f7p" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.536170 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-utilities/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.546209 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-content/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.551091 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-content/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.752169 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-utilities/0.log" Dec 16 16:06:35 crc kubenswrapper[4728]: I1216 16:06:35.768676 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/extract-content/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.006152 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/extract-utilities/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.252734 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/extract-utilities/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.332022 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/extract-content/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.347722 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/extract-content/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.489904 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pln52_12774c70-805e-47d0-9c1f-e0b59a4f9d06/registry-server/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.572670 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/extract-utilities/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.596795 4728 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-z4f7p"] Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.630695 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/extract-content/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.656501 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z4f7p_cf8e0dfc-68ba-4f9c-9c98-8078e3cc9ec9/registry-server/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.767373 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-utilities/0.log" Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.975635 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pln52"] Dec 16 16:06:36 crc kubenswrapper[4728]: I1216 16:06:36.975898 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pln52" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="registry-server" containerID="cri-o://da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac" gracePeriod=2 Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.025611 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-utilities/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.041948 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-content/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.091363 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-content/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.315334 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-content/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.328822 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/extract-utilities/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.394905 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pln52" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.453526 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-utilities\") pod \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.453584 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-catalog-content\") pod \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.453629 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkkbm\" (UniqueName: \"kubernetes.io/projected/12774c70-805e-47d0-9c1f-e0b59a4f9d06-kube-api-access-qkkbm\") pod \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\" (UID: \"12774c70-805e-47d0-9c1f-e0b59a4f9d06\") " Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.458383 4728 generic.go:334] "Generic (PLEG): container finished" podID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerID="da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac" exitCode=0 Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.458543 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pln52" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.458562 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerDied","Data":"da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac"} Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.458951 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pln52" event={"ID":"12774c70-805e-47d0-9c1f-e0b59a4f9d06","Type":"ContainerDied","Data":"9d9f6bc56e9b9cf80a210b06584d30c7c5de1c6abb9e44bb7499d0a27a718d0f"} Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.459055 4728 scope.go:117] "RemoveContainer" containerID="da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.462516 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-utilities" (OuterVolumeSpecName: "utilities") pod "12774c70-805e-47d0-9c1f-e0b59a4f9d06" (UID: "12774c70-805e-47d0-9c1f-e0b59a4f9d06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.465596 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12774c70-805e-47d0-9c1f-e0b59a4f9d06-kube-api-access-qkkbm" (OuterVolumeSpecName: "kube-api-access-qkkbm") pod "12774c70-805e-47d0-9c1f-e0b59a4f9d06" (UID: "12774c70-805e-47d0-9c1f-e0b59a4f9d06"). InnerVolumeSpecName "kube-api-access-qkkbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.533370 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12774c70-805e-47d0-9c1f-e0b59a4f9d06" (UID: "12774c70-805e-47d0-9c1f-e0b59a4f9d06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.560650 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.560690 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkkbm\" (UniqueName: \"kubernetes.io/projected/12774c70-805e-47d0-9c1f-e0b59a4f9d06-kube-api-access-qkkbm\") on node \"crc\" DevicePath \"\"" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.560704 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12774c70-805e-47d0-9c1f-e0b59a4f9d06-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.563269 4728 scope.go:117] "RemoveContainer" containerID="e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.613606 4728 scope.go:117] "RemoveContainer" containerID="6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.614807 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dhkkk_28557b66-a02a-4c9e-880f-3d9f21e5892b/marketplace-operator/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.662971 4728 scope.go:117] "RemoveContainer" containerID="da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac" Dec 16 16:06:37 crc kubenswrapper[4728]: E1216 16:06:37.663971 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac\": container with ID starting with da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac not found: ID does not exist" containerID="da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.664005 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac"} err="failed to get container status \"da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac\": rpc error: code = NotFound desc = could not find container \"da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac\": container with ID starting with da05f87c8dc98e6ab99352b73769a6d93c09e183ab6dd3280415e440c01672ac not found: ID does not exist" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.664025 4728 scope.go:117] "RemoveContainer" containerID="e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95" Dec 16 16:06:37 crc kubenswrapper[4728]: E1216 16:06:37.664601 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95\": container with ID starting with e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95 not found: ID does not exist" containerID="e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.664629 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95"} err="failed to get container status \"e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95\": rpc error: code = NotFound desc = could not find container \"e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95\": container with ID starting with e73c544e446ae88a5892635b6280033f4064a3f9a6e2e655f2312be51db36e95 not found: ID does not exist" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.664643 4728 scope.go:117] "RemoveContainer" containerID="6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18" Dec 16 16:06:37 crc kubenswrapper[4728]: E1216 16:06:37.665063 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18\": container with ID starting with 6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18 not found: ID does not exist" containerID="6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.665084 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18"} err="failed to get container status \"6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18\": rpc error: code = NotFound desc = could not find container \"6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18\": container with ID starting with 6d18d29d0e0431f38ad8b1f897bd68de792ecf1d07f9631b0fbe2a120ed22d18 not found: ID does not exist" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.686882 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-utilities/0.log" Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.806462 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pln52"] Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.812742 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pln52"] Dec 16 16:06:37 crc kubenswrapper[4728]: I1216 16:06:37.900881 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7wwrk_001d33fe-6bb7-4554-919c-e990321a2590/registry-server/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.077145 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-utilities/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.158010 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-content/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.185665 4728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-content/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.294402 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-utilities/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.374694 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/extract-content/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.670509 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/extract-utilities/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.794288 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5t4gg_c35eada5-7775-4d8c-92e3-c744b7f223a1/registry-server/0.log" Dec 16 16:06:38 crc kubenswrapper[4728]: I1216 16:06:38.864530 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/extract-utilities/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.074028 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/extract-content/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.116669 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.117060 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.178275 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.264502 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/extract-content/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.354348 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/extract-content/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.389102 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/extract-utilities/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.402545 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jlhdb_91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/registry-server/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.517057 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" path="/var/lib/kubelet/pods/12774c70-805e-47d0-9c1f-e0b59a4f9d06/volumes" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.517990 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.548932 4728 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-utilities/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.689247 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-utilities/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.752615 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-content/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.774832 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-content/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.947377 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-content/0.log" Dec 16 16:06:39 crc kubenswrapper[4728]: I1216 16:06:39.993334 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/extract-utilities/0.log" Dec 16 16:06:40 crc kubenswrapper[4728]: I1216 16:06:40.337058 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cbdqp_926ced6a-c5ef-4bef-ac8f-4e24b9a3adff/registry-server/0.log" Dec 16 16:06:42 crc kubenswrapper[4728]: I1216 16:06:42.173274 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlhdb"] Dec 16 16:06:42 crc kubenswrapper[4728]: I1216 16:06:42.499233 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jlhdb" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="registry-server" containerID="cri-o://286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3" gracePeriod=2 Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.051679 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.169441 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7xgx\" (UniqueName: \"kubernetes.io/projected/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-kube-api-access-n7xgx\") pod \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.169591 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-utilities\") pod \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.170123 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-utilities" (OuterVolumeSpecName: "utilities") pod "91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" (UID: "91b3c438-a0e2-491d-bfb9-44c14a7aa5bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.170286 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-catalog-content\") pod \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\" (UID: \"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd\") " Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.175242 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-kube-api-access-n7xgx" (OuterVolumeSpecName: "kube-api-access-n7xgx") pod "91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" (UID: "91b3c438-a0e2-491d-bfb9-44c14a7aa5bd"). InnerVolumeSpecName "kube-api-access-n7xgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.179283 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.179316 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7xgx\" (UniqueName: \"kubernetes.io/projected/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-kube-api-access-n7xgx\") on node \"crc\" DevicePath \"\"" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.193026 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" (UID: "91b3c438-a0e2-491d-bfb9-44c14a7aa5bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.281781 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.512588 4728 generic.go:334] "Generic (PLEG): container finished" podID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerID="286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3" exitCode=0 Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.512657 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlhdb" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.515610 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlhdb" event={"ID":"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd","Type":"ContainerDied","Data":"286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3"} Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.515644 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlhdb" event={"ID":"91b3c438-a0e2-491d-bfb9-44c14a7aa5bd","Type":"ContainerDied","Data":"ab92706403d5af422496f8cf2cd18e557402ee253820ab733987f6f6f295b6f0"} Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.515662 4728 scope.go:117] "RemoveContainer" containerID="286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.539396 4728 scope.go:117] "RemoveContainer" containerID="b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.565870 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlhdb"] Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.575631 4728 scope.go:117] "RemoveContainer" containerID="4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.577858 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlhdb"] Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.612672 4728 scope.go:117] "RemoveContainer" containerID="286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3" Dec 16 16:06:43 crc kubenswrapper[4728]: E1216 16:06:43.613125 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3\": container with ID starting with 286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3 not found: ID does not exist" containerID="286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.613184 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3"} err="failed to get container status \"286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3\": rpc error: code = NotFound desc = could not find container \"286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3\": container with ID starting with 286bbc456150323cda0e1a2f0ab5bbc86fe8d58965ae2f5a7d748dc8bef2bdf3 not found: ID does not exist" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.613210 4728 scope.go:117] "RemoveContainer" containerID="b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412" Dec 16 16:06:43 crc kubenswrapper[4728]: E1216 16:06:43.613536 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412\": container with ID starting with b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412 not found: ID does not exist" containerID="b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.613630 4728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412"} err="failed to get container status \"b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412\": rpc error: code = NotFound desc = could not find container \"b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412\": container with ID starting with b3f71c6e6a2197290b232fa1dcb27561e36a5362d89da2249b63da7988a86412 not found: ID does not exist" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.613709 4728 scope.go:117] "RemoveContainer" containerID="4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e" Dec 16 16:06:43 crc kubenswrapper[4728]: E1216 16:06:43.614014 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e\": container with ID starting with 4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e not found: ID does not exist" containerID="4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e" Dec 16 16:06:43 crc kubenswrapper[4728]: I1216 16:06:43.614033 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e"} err="failed to get container status \"4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e\": rpc error: code = NotFound desc = could not find container \"4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e\": container with ID starting with 4fac2163ef6d846107ad27349e41ffc6121fc84f0a420033b7cadfa84ab34d1e not found: ID does not exist" Dec 16 16:06:45 crc kubenswrapper[4728]: I1216 16:06:45.516619 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" path="/var/lib/kubelet/pods/91b3c438-a0e2-491d-bfb9-44c14a7aa5bd/volumes" Dec 16 16:08:22 crc kubenswrapper[4728]: I1216 16:08:22.494398 4728 generic.go:334] "Generic (PLEG): container finished" podID="11afde6c-0804-4567-bfca-9495c69e47c1" containerID="45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8" exitCode=0 Dec 16 16:08:22 crc kubenswrapper[4728]: I1216 16:08:22.494581 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7mmv/must-gather-vxp82" event={"ID":"11afde6c-0804-4567-bfca-9495c69e47c1","Type":"ContainerDied","Data":"45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8"} Dec 16 16:08:22 crc kubenswrapper[4728]: I1216 16:08:22.495641 4728 scope.go:117] "RemoveContainer" containerID="45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8" Dec 16 16:08:22 crc kubenswrapper[4728]: I1216 16:08:22.807275 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7mmv_must-gather-vxp82_11afde6c-0804-4567-bfca-9495c69e47c1/gather/0.log" Dec 16 16:08:33 crc kubenswrapper[4728]: I1216 16:08:33.827323 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7mmv/must-gather-vxp82"] Dec 16 16:08:33 crc kubenswrapper[4728]: I1216 16:08:33.828276 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-g7mmv/must-gather-vxp82" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="copy" containerID="cri-o://edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9" gracePeriod=2 Dec 16 16:08:34 crc 
kubenswrapper[4728]: I1216 16:08:33.835754 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7mmv/must-gather-vxp82"] Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.434314 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7mmv_must-gather-vxp82_11afde6c-0804-4567-bfca-9495c69e47c1/copy/0.log" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.435073 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.569955 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11afde6c-0804-4567-bfca-9495c69e47c1-must-gather-output\") pod \"11afde6c-0804-4567-bfca-9495c69e47c1\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.570710 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbvg6\" (UniqueName: \"kubernetes.io/projected/11afde6c-0804-4567-bfca-9495c69e47c1-kube-api-access-fbvg6\") pod \"11afde6c-0804-4567-bfca-9495c69e47c1\" (UID: \"11afde6c-0804-4567-bfca-9495c69e47c1\") " Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.578921 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11afde6c-0804-4567-bfca-9495c69e47c1-kube-api-access-fbvg6" (OuterVolumeSpecName: "kube-api-access-fbvg6") pod "11afde6c-0804-4567-bfca-9495c69e47c1" (UID: "11afde6c-0804-4567-bfca-9495c69e47c1"). InnerVolumeSpecName "kube-api-access-fbvg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.617063 4728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7mmv_must-gather-vxp82_11afde6c-0804-4567-bfca-9495c69e47c1/copy/0.log" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.617462 4728 generic.go:334] "Generic (PLEG): container finished" podID="11afde6c-0804-4567-bfca-9495c69e47c1" containerID="edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9" exitCode=143 Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.617519 4728 scope.go:117] "RemoveContainer" containerID="edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.617660 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g7mmv/must-gather-vxp82" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.649302 4728 scope.go:117] "RemoveContainer" containerID="45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.673656 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbvg6\" (UniqueName: \"kubernetes.io/projected/11afde6c-0804-4567-bfca-9495c69e47c1-kube-api-access-fbvg6\") on node \"crc\" DevicePath \"\"" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.714728 4728 scope.go:117] "RemoveContainer" containerID="edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9" Dec 16 16:08:34 crc kubenswrapper[4728]: E1216 16:08:34.715238 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9\": container with ID starting with edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9 not found: ID does not exist" containerID="edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.715290 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9"} err="failed to get container status \"edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9\": rpc error: code = NotFound desc = could not find container \"edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9\": container with ID starting with edc0b6dbb00414216c6ad45be94ef228072b60b0f61cfeeaafc241b5f5a0a4f9 not found: ID does not exist" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.715310 4728 scope.go:117] "RemoveContainer" containerID="45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8" Dec 16 16:08:34 crc kubenswrapper[4728]: E1216 16:08:34.715868 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8\": container with ID starting with 45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8 not found: ID does not exist" containerID="45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.715984 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8"} err="failed to get container status \"45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8\": rpc error: code = NotFound desc = could not find container \"45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8\": container with ID starting with 45ec87e33aa7473c834e20c22ac28b906144a2cfddbe8af8aee8e16dc3afecc8 not found: ID does not exist" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.741275 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11afde6c-0804-4567-bfca-9495c69e47c1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "11afde6c-0804-4567-bfca-9495c69e47c1" (UID: "11afde6c-0804-4567-bfca-9495c69e47c1"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:08:34 crc kubenswrapper[4728]: I1216 16:08:34.774278 4728 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11afde6c-0804-4567-bfca-9495c69e47c1-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 16 16:08:35 crc kubenswrapper[4728]: I1216 16:08:35.517701 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" path="/var/lib/kubelet/pods/11afde6c-0804-4567-bfca-9495c69e47c1/volumes" Dec 16 16:08:38 crc kubenswrapper[4728]: I1216 16:08:38.818523 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:08:38 crc kubenswrapper[4728]: I1216 16:08:38.818966 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:08:51 crc kubenswrapper[4728]: I1216 16:08:51.787313 4728 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="76f2644a-8bb9-4719-83dd-429202a52446" containerName="galera" probeResult="failure" output="command timed out" Dec 16 16:09:08 crc kubenswrapper[4728]: I1216 16:09:08.818835 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:09:08 crc kubenswrapper[4728]: I1216 16:09:08.820352 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:09:38 crc kubenswrapper[4728]: I1216 16:09:38.819060 4728 patch_prober.go:28] interesting pod/machine-config-daemon-njzmx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 16 16:09:38 crc kubenswrapper[4728]: I1216 16:09:38.820553 4728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 16 16:09:38 crc kubenswrapper[4728]: I1216 16:09:38.820706 4728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" Dec 16 16:09:38 crc kubenswrapper[4728]: I1216 16:09:38.821661 4728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d42f638dc3af8bacb277ff5376bc18296ad870667edf5b251b48b95a53881011"} pod="openshift-machine-config-operator/machine-config-daemon-njzmx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 16 16:09:38 crc kubenswrapper[4728]: I1216 16:09:38.821821 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" podUID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerName="machine-config-daemon" containerID="cri-o://d42f638dc3af8bacb277ff5376bc18296ad870667edf5b251b48b95a53881011" gracePeriod=600 Dec 16 16:09:39 crc kubenswrapper[4728]: I1216 16:09:39.212795 4728 generic.go:334] "Generic (PLEG): container finished" podID="d5cdc17e-067e-4d74-b768-02966221d3ae" containerID="d42f638dc3af8bacb277ff5376bc18296ad870667edf5b251b48b95a53881011" exitCode=0 Dec 16 16:09:39 crc kubenswrapper[4728]: I1216 16:09:39.212893 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerDied","Data":"d42f638dc3af8bacb277ff5376bc18296ad870667edf5b251b48b95a53881011"} Dec 16 16:09:39 crc kubenswrapper[4728]: I1216 16:09:39.213706 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-njzmx" event={"ID":"d5cdc17e-067e-4d74-b768-02966221d3ae","Type":"ContainerStarted","Data":"17b1d0a1ee26c551f15bea7fd8fe964efa7d77cde271f9eb88b497b870659496"} Dec 16 16:09:39 crc kubenswrapper[4728]: I1216 16:09:39.213752 4728 scope.go:117] "RemoveContainer" containerID="dda3ed2f75dae54a62b7761e594edda8cee5d51440d323dc578e83071c52a51d" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.704126 4728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2jtg2"] Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705126 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="extract-utilities" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705140 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="extract-utilities" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705151 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="copy" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705157 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="copy" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705168 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="extract-content" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705174 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="extract-content" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705196 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="extract-content" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705202 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="extract-content" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 
16:10:39.705222 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="registry-server" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705228 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="registry-server" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705238 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="registry-server" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705245 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="registry-server" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705260 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="gather" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705265 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="gather" Dec 16 16:10:39 crc kubenswrapper[4728]: E1216 16:10:39.705273 4728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="extract-utilities" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705279 4728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="extract-utilities" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705475 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="12774c70-805e-47d0-9c1f-e0b59a4f9d06" containerName="registry-server" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705487 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="gather" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705501 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="11afde6c-0804-4567-bfca-9495c69e47c1" containerName="copy" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.705512 4728 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b3c438-a0e2-491d-bfb9-44c14a7aa5bd" containerName="registry-server" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.707020 4728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.716451 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2jtg2"] Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.750474 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-catalog-content\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.750705 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b9z6\" (UniqueName: \"kubernetes.io/projected/018117a4-a716-45a9-87a7-59f167847125-kube-api-access-8b9z6\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.750777 4728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-utilities\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.852695 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b9z6\" (UniqueName: \"kubernetes.io/projected/018117a4-a716-45a9-87a7-59f167847125-kube-api-access-8b9z6\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.852769 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-utilities\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.852824 4728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-catalog-content\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.853307 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-utilities\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:39 crc kubenswrapper[4728]: I1216 16:10:39.853321 4728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-catalog-content\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:40 crc kubenswrapper[4728]: I1216 16:10:40.001262 4728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8b9z6\" (UniqueName: \"kubernetes.io/projected/018117a4-a716-45a9-87a7-59f167847125-kube-api-access-8b9z6\") pod \"community-operators-2jtg2\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:40 crc kubenswrapper[4728]: I1216 16:10:40.033622 4728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:40 crc kubenswrapper[4728]: I1216 16:10:40.511724 4728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2jtg2"] Dec 16 16:10:41 crc kubenswrapper[4728]: I1216 16:10:41.302870 4728 generic.go:334] "Generic (PLEG): container finished" podID="018117a4-a716-45a9-87a7-59f167847125" containerID="e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9" exitCode=0 Dec 16 16:10:41 crc kubenswrapper[4728]: I1216 16:10:41.302915 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jtg2" event={"ID":"018117a4-a716-45a9-87a7-59f167847125","Type":"ContainerDied","Data":"e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9"} Dec 16 16:10:41 crc kubenswrapper[4728]: I1216 16:10:41.303134 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jtg2" event={"ID":"018117a4-a716-45a9-87a7-59f167847125","Type":"ContainerStarted","Data":"364faee0a8ffdd72bdb92739b7d8a5181ebe6515971f7e3826bc2c9babfdf8f8"} Dec 16 16:10:43 crc kubenswrapper[4728]: I1216 16:10:43.322134 4728 generic.go:334] "Generic (PLEG): container finished" podID="018117a4-a716-45a9-87a7-59f167847125" containerID="801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a" exitCode=0 Dec 16 16:10:43 crc kubenswrapper[4728]: I1216 16:10:43.322191 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jtg2" event={"ID":"018117a4-a716-45a9-87a7-59f167847125","Type":"ContainerDied","Data":"801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a"} Dec 16 16:10:44 crc kubenswrapper[4728]: I1216 16:10:44.333739 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jtg2" event={"ID":"018117a4-a716-45a9-87a7-59f167847125","Type":"ContainerStarted","Data":"3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1"} Dec 16 16:10:44 crc kubenswrapper[4728]: I1216 16:10:44.361470 4728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2jtg2" podStartSLOduration=2.793859713 podStartE2EDuration="5.361452961s" podCreationTimestamp="2025-12-16 16:10:39 +0000 UTC" firstStartedPulling="2025-12-16 16:10:41.305329447 +0000 UTC m=+4422.145508461" lastFinishedPulling="2025-12-16 16:10:43.872922695 +0000 UTC m=+4424.713101709" observedRunningTime="2025-12-16 16:10:44.357232728 +0000 UTC m=+4425.197411792" watchObservedRunningTime="2025-12-16 16:10:44.361452961 +0000 UTC m=+4425.201631945" Dec 16 16:10:50 crc kubenswrapper[4728]: I1216 16:10:50.034303 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:50 crc kubenswrapper[4728]: I1216 16:10:50.034925 4728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:50 crc kubenswrapper[4728]: I1216 16:10:50.092275 4728 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:50 crc kubenswrapper[4728]: I1216 16:10:50.472131 4728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:50 crc kubenswrapper[4728]: I1216 16:10:50.541088 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2jtg2"] Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.427097 4728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2jtg2" podUID="018117a4-a716-45a9-87a7-59f167847125" containerName="registry-server" containerID="cri-o://3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1" gracePeriod=2 Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.890127 4728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.961906 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b9z6\" (UniqueName: \"kubernetes.io/projected/018117a4-a716-45a9-87a7-59f167847125-kube-api-access-8b9z6\") pod \"018117a4-a716-45a9-87a7-59f167847125\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.961956 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-utilities\") pod \"018117a4-a716-45a9-87a7-59f167847125\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.962073 4728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-catalog-content\") pod \"018117a4-a716-45a9-87a7-59f167847125\" (UID: \"018117a4-a716-45a9-87a7-59f167847125\") " Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.962791 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-utilities" (OuterVolumeSpecName: "utilities") pod "018117a4-a716-45a9-87a7-59f167847125" (UID: "018117a4-a716-45a9-87a7-59f167847125"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:10:52 crc kubenswrapper[4728]: I1216 16:10:52.967094 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018117a4-a716-45a9-87a7-59f167847125-kube-api-access-8b9z6" (OuterVolumeSpecName: "kube-api-access-8b9z6") pod "018117a4-a716-45a9-87a7-59f167847125" (UID: "018117a4-a716-45a9-87a7-59f167847125"). InnerVolumeSpecName "kube-api-access-8b9z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.022689 4728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "018117a4-a716-45a9-87a7-59f167847125" (UID: "018117a4-a716-45a9-87a7-59f167847125"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.064798 4728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.064846 4728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b9z6\" (UniqueName: \"kubernetes.io/projected/018117a4-a716-45a9-87a7-59f167847125-kube-api-access-8b9z6\") on node \"crc\" DevicePath \"\"" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.064857 4728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/018117a4-a716-45a9-87a7-59f167847125-utilities\") on node \"crc\" DevicePath \"\"" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.439514 4728 generic.go:334] "Generic (PLEG): container finished" podID="018117a4-a716-45a9-87a7-59f167847125" containerID="3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1" exitCode=0 Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.439565 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jtg2" event={"ID":"018117a4-a716-45a9-87a7-59f167847125","Type":"ContainerDied","Data":"3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1"} Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.439608 4728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jtg2" event={"ID":"018117a4-a716-45a9-87a7-59f167847125","Type":"ContainerDied","Data":"364faee0a8ffdd72bdb92739b7d8a5181ebe6515971f7e3826bc2c9babfdf8f8"} Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.439629 4728 scope.go:117] "RemoveContainer" containerID="3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.440507 4728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2jtg2" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.465176 4728 scope.go:117] "RemoveContainer" containerID="801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.502588 4728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2jtg2"] Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.504310 4728 scope.go:117] "RemoveContainer" containerID="e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.524035 4728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2jtg2"] Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.558312 4728 scope.go:117] "RemoveContainer" containerID="3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1" Dec 16 16:10:53 crc kubenswrapper[4728]: E1216 16:10:53.558990 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1\": container with ID starting with 3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1 not found: ID does not exist" containerID="3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.559130 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1"} err="failed to get container status \"3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1\": rpc error: code = NotFound desc = could not find container \"3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1\": container with ID starting with 3a2a87aa1c09c1171953d2f4c82516c5832e916f0b88e3985b48f34d52bdf0f1 not found: ID does not exist" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.559257 4728 scope.go:117] "RemoveContainer" containerID="801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a" Dec 16 16:10:53 crc kubenswrapper[4728]: E1216 16:10:53.559804 4728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a\": container with ID starting with 801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a not found: ID does not exist" containerID="801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.559842 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a"} err="failed to get container status \"801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a\": rpc error: code = NotFound desc = could not find container \"801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a\": container with ID starting with 801991d73c445c3bea43e20e434187b4f0a59e1aed130c990bb7f3b1d11dff2a not found: ID does not exist" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.559867 4728 scope.go:117] "RemoveContainer" containerID="e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9" Dec 16 16:10:53 crc kubenswrapper[4728]: E1216 16:10:53.560199 4728 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9\": container with ID starting with e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9 not found: ID does not exist" containerID="e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9" Dec 16 16:10:53 crc kubenswrapper[4728]: I1216 16:10:53.560337 4728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9"} err="failed to get container status \"e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9\": rpc error: code = NotFound desc = could not find container \"e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9\": container with ID starting with e7333f4a1119123af8a75e8d2c621f524be5d0b13be6b25d133754a68763edc9 not found: ID does not exist" Dec 16 16:10:55 crc kubenswrapper[4728]: I1216 16:10:55.519063 4728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018117a4-a716-45a9-87a7-59f167847125" path="/var/lib/kubelet/pods/018117a4-a716-45a9-87a7-59f167847125/volumes"